[Binary archive content omitted: POSIX ustar tar archive containing
var/home/core/zuul-output/ (directory, mode 0755, owner core:core),
var/home/core/zuul-output/logs/ (directory, mode 0755, owner core:core), and
var/home/core/zuul-output/logs/kubelet.log.gz (file, mode 0644, owner core:core),
a gzip-compressed file whose stored original name is kubelet.log.
The compressed payload is unrecoverable binary data and is not reproduced here.]
NWY/pCڝQS6t,H`UIHeӂ*(y̩B$^4j}ŜXģ]ب_IXT %*1q]6JOg:uDz-_Yl%s7ynu^UxWN>JMQGjװO$;:mqFxer${oqUe-cH"i4Wօ wt7{.W+f'mսcxMloS[{lnyJm4s֓Yg&lƼm+e6/.%_gVw ^V7 [q厦}^W=ScC-wxY .8 $Rs] ؐ|<P( !;rwR޽w_[}5ٕjwjmAZ{BZ[CZȰa?NQV[V!f4pɃ0i8])B.RG OL3ub"eNE]Pw֋\,%`-QKFGυV()55&3BDN H}0ky3iaG8N~%[޵۶68  <"57-Lzq!?VnSRTo۬#1{'^'? /'M(~GԔ՜Vp9RPIRo?}݊.oJ6vнdgQ0'/jK5'ë$x/G9뢰u2>%iZ_2Wo_2A C%0P%Z3+a`ų^2p^^v_˙?$I/3 lIyD^{^Ìa)MlR[^b$IQIˀ/2ŽIoVyoy҇In`gIf_~72-"AV#CYe\9[%޼> 1(8ҁF)#IBEޙ9>q!O:S G@p<xƋCƶYgmLNfj6ӛ}^ow귛q E}?K?HoDDEܦ>mSm}QB)d4сN*Tx*I 6L3,CpǵrJ!c!sw߹;~mro6af8C]sw߹;PodVv4|ERX[ PYNdnؑ7w`G܁'`#0G}/Ë=v%d'jyvJk4(aq)$"PATDMg-U`䅲Eډt5JQ#e̜0Nj"2Zmh (^l;[_#j'v>; E+,yIsLuCPDozc[3zDWsׯO//ƧUTIͨ6BV'~Q|kɀ.mB9q#N>¨YػLlx61kS?g\20{Cы\?4蟝Ζ۳]2W^ Ҷ3+XE2},_H!G5̧h8vErUY!Vm}Vé\1/xF^Zp՟o}saěy t-d%Fo &/LѺ[ҝf/ESn'MiR/3SzB{?=TTMdϑHLӏcOo?zOOݧ?E9|/g\̓'m@'l?KRKz󥹆-VS7ͺ#"%V/BB41iNo 9,`YXE*ӽ*I@s%M6*q2e+$aBՈj$"SŁ؄L= 蔤6̹KUc8yb ZI)B5@稄ĝ ߕ|ϔCuXl}9>ws&krł7Fs` {V pJYYR9wë^9__5@͜WqߺW|^v?y,yd*@5>Qi"2HyAJS ĩv!ă2"_fdyV4@$b_k1xl!3qWϛogt<_7xV8l@U'tg *{uy>ȼFsKo'V:q\Ut/*FR԰j"ygZ^l4+ .吋 Km`QgjܡCE6it91%G7Vo Spl8htBA߽X D-:E+:@J"D:JԸTV䃪PAԬ҉Bet2QX9iiG9 '$(MRD9,GmzL]ݝ"uwCWw87L-jkm*ynkswgg⵾M >:y s 3cuˬ%RgH2"Rde9*/<-C!- P&SZ8o!qkMrQ"8dy(XZ|;+۽kH[~E,{-g$|A$c:F1'Ż\]I*RkC ZlP eXn9[-GpH7qFuӉ"ڗ*)$&u,PT%C1bdbPsgx|fȋ*[+o.C֜2ʴ#P~7ZH2)-w@(y#ZV)rПE!C½!>S!ǂAV R6!*mg3g\R ݌]y!/ y ]^e-;23hz3Mß `02?Yg. 
H<FXYE4$B$I6M艋`kpǐ?}028 A1TBuh܈L>&.)ۍaI<]vʵ0׆kwvΆ*`{XhH J*+e$5Ih*`uZ-Lв0 wZP@ !Ԣ= yMKFұ;dBxT if~X+I GӗrDQ##vH|TTjk <(`)H'QRʠ2Px<ڷ8^Ҏ^4( zVc]\u۽o]^C`~}}bNQ_ǭ=ܯ{.fW+'l||γc$%^MG7GI7{<0@i{]BFgyճդ#6ZiQne,<..kQ #=ͺ:=0auDDqkh]+ZҫɢtӦ%RKۼyo7YHږqA/;&.wqݚb.t]:e]1잟t~ƓuCbef3p[jR9 -U-"ᘰ63樔4mx؛-Χx) uD tҡ]r,t I ( vo!Iw@]R-Fa[g8$E EJn13Uu E0&F$0I&)58wΞ8Exr{*)7:F[tkqcM!wήGrH\,;gY.{˞gY.{!fS h.{Qg.{˞gY.{˞gY.{˞͹Y.{˞gy-erٳ\,=e2erٳ\,=+b.Ɣ˞eI˒˞gY.{:iB8X324|Bɕe.ẒO؁CbR%c6(SZ8SLQD3Q `YR1# E Tvǀ1T)b+t@Hmd*@SJx-[l~ ]CЕPZNҒ|$kX5PS:3jM=͜ݜ=%</7j&n\Rq БO}b`@}]d|c̱r<$i/9$QK-9+aG"4a@~҃k^ }_\aGxew0}04ֆbI0JfoPf ~\jAߧ0~13X4LrgWC^}nѿ[e]µgkq7P6C-XK-lKpjV!Ց RR7!"mFš]*$TɂĔ#"ᷠy`#ۛP\^u1YJRG Ɲ$ʸ1XSsu RBkgʥ1IM[i%,C0Z# Ty7"xFPV)) z{!6f4Ż]M/mK(vIYo~s$)ɺ7!}+CZQ#KKP,-^q$Ϧ%d'r\ I>) 78A~rڧ(0 kY,7Mk^zsUw=Y2Hf,-)Yg =D exBէxlo׮v.ָ$Ch8rʭJNnӇærOƐi\{^>A# 5Ƀjcі69H p;}nDlqk[g:RjKf(Ld)Y2S7CK [r– [6]߲geqHHHKR*hp(+.5IK-c1OJtQ~RM2`K.-tʠ2zQ>c={͐2$Up;oB!Ƅ3@cĜE NVFV7X>ǽmԾ'=AQtBf_.?xL"aPHPO1Ļz7]6+׽ 7~E1Y"x̕iQ2Ku%e *F9S4W8TF r-)gxTc}B bsX*kX:dԃx C] H$wX-8QP>PbʏL΃#O)5otX7C-/d|ŕ{`W2z-l"itT+ZJ8Fõ<:LjQy`l)JЕUD1ZleF͝Q@ F刊` ;A UtҾ] `[|ޝZ;iW\S%YIԂ3#%ϛ4 Eec% *bVj+جl΅5Ǭ9ȭ;>051ƊH9*`@XE.`Hx0%AY5!.!x030{-#cR1쌜5Ǧ }}s; ZO[j3ft twDn$`Oԕ&+WjtdD:MLOF,:b"b^󇻈Ξf~H_{{ӻ#Y߁AK4FwYɅDžQ'bkN?}tlS}ii]BsqݟBwWJU_ܲn~\`8\7WEЂ,.'aM8b Q"zM~H=xNrr9 ~G*w=ev>ڍKV-k/C^qK-vGTÚ%SFJaAXģtjb@:RbP1!SXa*5(K)µmz)z<9;f|~BVf}lz!RJZajm^Ԑ]lZ^rQ l&w9Nrp "#g < 8IǚGp`b15OV_;o'»E`v8R3bn^?^>]sK|nDEi2³&:0z"'t iuY"|?'4i=ٻ7 NJ^lub1B=7[3RלҏOoȥ|xwW:Ybvo}_G `xSAdvMTEwVTr+24(XbPM ԍn/AHC?>\`.ﻋ>'x/RTf+ $(뻟еko5Ul%r̈́үr-e_XaRn|(1ČfH;>"p_f!=$aFhM<,llɋP-#^ \a`qg7^RLE=r6LW?Np)j#pjR"(; J ÄA» 3&#XN':'335-85-1G;'Dx, :y.U8KǁvizGM9Pޖ,J]*t>`٣3FE8O s"cEbqӺbTIDKVHڐ6HE ȡ Fj*e;Q!(8v0)&8  :H+rzu; ߸Q@+s !ss&_A3uTXYprX4.+Hy݇ YRNGRxB@TJAOT{σ4AtQ-#)}Ԝh ;?|G p*lRXEd*?"\B6xјMKckH;SR\Ϸ.c)>Ą@42X@Hb$ ,p0fE@0Jqa3, 0GA־pŝk ́G2R`Y8EJ/>-/;0z sZD)Z¦eKk,)Q#мs'XdeܧG=s*R Qe:X@]‰!+-'8XC9&YG92t ȀRA-EbD h캜gglS)%3ΠuHT7ޞdCJ偳ltb[S/Z\ fs$ShXЧZwxw^gʯ G !QQ4AR;ќ 7nt'n1F$JfRE/cZxd ,bUdc(VѡS wPPGCǀ7‭iU"DƽuH Ìp4v ;#g;l{9xV|no6.Ԛnm 
}('"l[0u1EqJ9^m;^ɼZZ?qnC`B?IBQ/[ظsҏG)z0 C Ӝ6[yU{ G޼7o~ 7c=@]h2 #'_~Qbibߎb[`i=:AZm#I*J`2Wa]:3:I#E2:%=jo4v1qTʅざVŐ":a vWUJ߫׵F? U> Wpooxnus痪z贃鋚-dո5tPǢΨ{ wx3h{}^Hephv<6l']IVh4IUxPIɜ; F`ߧw]4 4fթ|/dC-ïDh#N[ w@QDȠ}NigRiW,HUTI`3<#JƜ 9X\* b|jQ$ͪ"hj*uoD $Y1ڢDwAGo`]oAÔp26ZZ壝);|NIbѥ`D)Ҩ5z5ɞ5)0sS҈gE¥:jtALڳRȮB&A ٥fّ#"L3R/3*1c5.D)5>#OAK0ACj)h\7cs^yۼ9e1or^.c#f @ztqt$FOB=V`&ۖh DiVۆnME^k Q^y2[=y`4vMfcN`QBϟrR1hXh'r%5 2P22 Q%fdiOdPH;@qoڬGdR4v"*TO[W6b]%r;XIy/aL;J-*FjQDFb1;L ;CTXD t أt%h#*`hAgp `NmSƚ^V0s0Hv -8֤Y 6)j@%HhϤyha3\Bv Zx:-}x *UaLMfFF:8FD+(|t* FWP '=ʤh_H,O30%7aFD,St1 L#i­_ujE#Vn˥!`PrW 0&ʥ"0xwNq$,X m\];S9зL0@5G w~9ONζ I2ިB #gANq֟SC {ҋMY]F^d-D\&#│Ɛ'b@H D;7J 2ׇ}QѼt%mV}J @b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J?H#}RG jo@VFkTJ V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%׫ژz' ~@07QGʗN X"+F%tb%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V}J Z[Q^6V}J(b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V}=JG[wW?)zym_]Zչrr>?{$*p}.!I?%`/^+}` \6xȹ= W0XYK"k_V=Wo$[Nں6Y@{Ã[w |W-0e=Pso98=9[] bY7vd 5 =3 iǷ;g28*aJ RdyTi09L?vB= W0H7v_Y^|"+}p5+/ %]}p/ÃM|꫃'oJۤnH䢆rv5kKjsIq'Ft,:]l1랺GfU\o)]7S(c<|ѿվ򆿞go1}.}5Bu1*`~:t{%5іx#9>n;̿?:?t d7{0@憸/ ։Jexa+\ i~ }@3/-M%N?Mb/]55-5ۋT 9S81 Kfg)vÃYR׏'3ݟmvrtPܣHlq E1l{;=Z~vԤkk5??:9?|h[+` Gu?r;XVEpOrRs?6֡_͏_'ÿwpk~.nopH4;Bb_\\`:=[h2N]uGm+T RVL|< 889G>4`jk]V9&̜+%mo-{V1eZFY#1z-]x-{-7sdub\@pt o -5GA(z:o1Ʒ:Tyc0:u޵6㏴YM!#dVӮ Ծ,OXZ\f[]mR{V`xƃV`r%tr3Jz[;>1/UX 䍆j2ՖU;:.tRGŴg&(;Y̹{#NHy6Or9ܵIɕ kNu#CHٛ~'͜ݡ4g-q'Z<8Y\9Z¢7wzn!h*xh1j 7x Wb P]K6BdzViWxÄq2sgycc8j_4^ZDm;;7MZ}IůPgvyQwwꂁ^hS'عgj{8_!eq/oCɮAЯ#THI|~3C8$-D̙fwMOWSUҹ(hTD#,p2 $B$I6M1+/4NCl``e%"( t3A%46ABd%ҥeJYb8,"Jm(,=]k61h.ߛAQHjT\zIHeӂZȈ )<*5!/YdMKPFBF9amOt: 1 ƽKJDQX"^"q1D1ɂ9%\{,J+ea({ŽYu L8ʸ&}-u7bcucM:o(>oG}n‡r4W?\% rD Vш>ʯOKI\.BR |ZTs:҄HC\Mxck|z%Ib_o{fGJ Zʡ4_;Ѹ=#STXrʁZHD\N-=?WM'Q5|b[Sg}?f (}a0v fלAo9̃H$ɦqlzC`|FCb[\_ svgK+ٗߧb6adY7=%w:59}~3atJF[[Zb}7"[˗3{swyϲ9wX=S0S:zx*%*v*DWҰy6 m ym Q&,^Rn=x/Vm[hM-ǑERڿBCu3:{ k&fIhZuRݭwl6mn{[[3ۆLR{h~FcVCΰM؎ysTʬY&9˖ެh.z픅JK:Qk9쬉6;^xnCz7f1o!I+NjKz sFb$)e% 9) %E xyu/COzZ6bL=K!? 
Cwjo(|}q]}`/wϝe8kdelHUV6Twm]:埫/ۗXw+߃ohz`mv(!ɳ!-dd(Mʝ/9{*-@ L)5DWBS9LٳSQB]2tXFsJJM$*@FLS Rpc)X z}ڨ[' XGy|s_];mn C]A=m~6*$OI&2S dיVի!MMݛ;xp*xx><` N䏳_r師i|/8F=/s4$*9EM$)ˣ(0k>r`xٽkBB "/J'#gͣp" AG3 G} fATDM(g-U`s" !ҡEP)c$pwxpV[8Qr8c14CB2?bQrI2%C_jř!#~{s!yU1wMdZfBrz iJ# Y.* 4+Nͥ< H{MsIp𥵎*?*6B6. M$.gJFh+x`!hEB!J.2(zADZAY^f*{%po)),n*U(A$yb g忓?~jچH" PoWRI5qK%1'5a'B.{F*H u0,H6j $rLaD ,ܱDDV:3!v^=xkuva} x޳so%??Njo͎;]*~o8v|q7y=/;8)!tஆx8 쪟e'Q@crŽ.'%9,Q7,J8j=!Ht_q5[npކ:+(/3s 1 ,ПY\Z-nBo__/_Qm Vpy ג]<4qz4w¨l o;Զ4~]u6w>7 YZ|2'zv;|xdu]v>T.F/dFZp৫ߍÙnG6ꝶwYc60g2=o]tRu˺[ӄo|5IdavL`{uJ\t67~xqL'?N??|zw2NIy2 >)@ڔǛ5l1%m%cܿ% ,joy#f#JCOw5,kh`Hbj:`ADL ^LhQAw*ؠ8P'ކtj=?[-IJRP+>)EsQr2㰉~SȦ^.&y>Sy)[;mi[ټ5n3lC]{0W>r7hn}wUiAkJ;Ou>s7M'D\ D3 9=$UQT>5XN%1* H, QM)Xh-5Bc{cl!onCw@[ /jgPܑ;E2U_|0.odŴ*-W9XEFgyJV%O棼I'?rάu$A9䤛&:D]YN)ƔHXqhGGi, BPXD)IBT&@JAxF<[O9?LIEx*9})G yΠ>xH8_+fc.j%CAId~YBjb~7t dBd%F-* *kH]ؿk jSѦ.+4(MB1Bbdt {02!BA$Pɓ(P$LH]Ou42|T$T fN* ʉr-rSQ8|> Di!Yd)PPQh`UYUiBScp@9hc)K$)'@PȭN*daY--F6|lrzS'+7:dfsKS}6k{{e*:infndUn]O..>RSۚȵSg]] w{=ӳ#$|;X [Wغ~ܺjZs<͆w#O 7 heUJ$k&nCXWWWWWσ'RWW{=e>p:Ke&v=x߸ -oAW# FkN?lopkĮk-BVc|~8ݸ>i1]<.V-d*,]Y߹Bp44".!Ҩ|xۧ`Ae7eJ$nuٻ^E!5XxZ"u/4AR鄡?3$OC99s@48\@)(S ?*(<"wtG)˧yQg\2IRa6 4 EZ$IhHE;_wRH .lE@EoX@]~#b 2AivdI5v"%v{z@-1FMX B|,끟 @Aۦf?& pH3*]բED}ꪆI|07co'Wi;ΌAWjT'[{snL+?fLt9L0+(^/F^h_QX(]}TTP2㺸akց@g;@K$PzGmOb0쪔M=̷t U,m:ިJ ^r-" `BK 1imw1yjq{)X0taΜXLjAq gJ R%p= UVSR%(T_a 5?j֋Dߏ2_ZŖw,1lF<;x xm=ٍ?S?UZ4^;jqT]޶^FYw>b[ޱϚuĺhOpJt@qUzYMo~xrٓDbn)8O#?E|o`b? ⪅.j+ddэ> M={.B6 r#.JtY"Bq $ͤP6wɁ5 v=R+N#IyYiWq%3sy(~b__UwSQ4oD!Ŏo6<`t,KW+"ӶdK־;y,#X,WeLn`NrDZvf+q~EMǝ+ގi[N#t>yeq \"%1 A{><'$(7p3t֦(X}05p<߆|J"L( ʬ# o) 1hBO {ENVRKy)v1D%Dި(0}Ct }jT{?~?M&*o1̨lZƼ7l_?.@\wC?U0o >>ВB(#"'wF+(Xh?t "jFXX=InXq GYրI΄ ;  hM(2o$ Q=U3ԉŠwe/V|~Qqbִz* ?W3¬Oo3 ZRNCqVdDo7NtT*@{Y tӞIK7REqVjޠG EέRȰ g -J)!RHD#5͗|#;qm+ӠeQZ{@Ji >4xJXgȰ$i01oP(֢#'8>jK >]-|b0DR%c6p%n+Uda6W̲ŧ˼ ReI/sͷ#z0>u{F" iE ⩵Fb6!^a028d1m А? 8%IlRyI=t$L "*N.p%}ù<:,f-؝>Q,!#}daNju ę7;K6M~` 畇 iU32 hGYrd|`#NudiS? 
Ї$OƧEJDY"^"q' Ej"*:,h*/)XD RrkM2Qψd(𚐘zHLrz0cXx҄I s-DND p⼨bd_(2E^.Ckzx x*0u+efy(Ga2kͿc#[gq ~ҙUD!+O|dN%bgTߝlLK}wVSNPJwݔ1ќ){{CF/WT o1C7bRc6(SX8SW^-`GߟyWiuJLܦGnSk{>NŽ}6`Y"xiQ0Ku%E *F9Sq;ZShqPׂ%B bsX*kX:d{ C]CH0ZpHDA@e Ք28Ϯd?>hx[Tn3*oF C[8_Oۺ@T1!~P/лhA ёRh9*Z V3RBF偰Tg`UAc,6ʌ ;BL 5T!7dzdu?|wutm=&=uOQMHWkyR tFQZdTWmNt2j~2W81+Gw_̔F,q`ABE >2 lk۷n]}T]ZsxAX:J^cSe ud@ x$ ^[)h՘IDk1>,5[!-lt tU}K Ƌٗrf׵.5~iSRj  WEeo]ܲfk5۲t/=.5pEk@j+&mv\L.e)uzXH߭{+6pⶹ.>Y$+7M)V}5wyۉ{֋,\ZFw/?i#w^NI q^Ɨ5Vҡ,6<>JD}wmG,.r3 &/;0e$Y~ݒնYfOUtۿ6]tib/zEqjX)׀bиD-|sѭӏ{~ػ#H!z#ɗjcupFKtkE,A!ģqQ12Ry2brgl 茱<:H32>HBɹB(ɘ3ң :RcZ}$r,B!2İ{eIF;|{vn]2+^3\/h]6MR G֒l"8I8JJR FȭxT2 cr$ou )A;hf=6K0: c dd,g[4҈Hhd/R}l'M4NNzj͗[_k afQ* 3ZB01JP3|2WPgS-M(A% 0N 0F{$$J/HKsp hT_EzU6Z.Ze r 0)LV0E\K/2c<1͸J(=՚.|,Q2W.z=k YƛJ_kyH.5=\R!WYtdrn^~ۇGz`*wo$ ?|/E$w8H@ LQ{K&RrtN`:#xzF_[shV`=w]bV{E`_\˺ZVxE6 Gmw||TfPϊߖc3S86Jw1] ]kD;fۼ ~}8;=?QÍF*]VwQ?{4`|I'|5*&6L%H_+ o/f'f cZaNq OzloM 0'aWNWu#Vw`e77'Rژg},XVj1>ٺӫ{l}\ge-(]+)xqn8 jA׆KZ ˊQ8+zit6¼R]l2Ѥ$Ycv{75ikl8.Od$˿_x}x?{PW`wᗝj>~m_]S%]S|~G^_nKnp[ⷲ3]ڑ݀ce'K:/GėQ-&$amrp"׹D43F1kPLoFszVP6,KӪqsFm Պ;%QBdG%}1 MNcjFSoY![2GTeȬ\bJ dᙏ)GTƵg/Ų[ o˩8 u.jg4փ!rcJL@V""=DoN h4O@E + uJ)Ò d2[1T^q"tRYoC))*)mCxi rU2.!*֚ {blV59|vBBHkS}](1)/< SNL1"C&;Iӣ*r|Cw=֫}y:VZ#L&MZ$F "@reB0J;T#n "gе]U g x%|V*~_-꤇`4AE7yѰL Cr#hC:)ϱCn.8%m򄺢ps@ jE7X-Nk(@*y.Hd]2p$ lÐ Ӈ{g3khu+icLbJ},WLF\[̰/<;ibNw\]dnRӸb+yt-=b.+#Ĕ`&2I9GXqs`I^N Ury>rlμGg}*P;p*¥Q3MPpnsS!x;97.םJnm }rZ c#xLwvMSUUī+4;*yԭgZIlH}/j}K+y}M?\d!}c:=\$0錴d$Ӎ?|< ZݤJ3ҋV2$IVzxZH-qxZdu_y胖^>(D'Avo W%y9,^ӵ%кچ~׃oi]W~]Nj/jqup^.&ig|_Sw?PF*/:A&7{x~:R٧a)Õ~ ޿{} s=Ýϣ#5n.ɨpB xidR491 ~^7^{.䐌灑 z&dR[;22- \R-&0&iz/ 102,9\X ]auI S[d]2={ݽe>ĻR7}\}؆n.9it׼ȷ_ B=3f)W4Q%씍\-zzDm2FƠ4Xʓ:Yε$a*8$ }>cS8B-)U]yЗ85^ lW-}#^rVmk_Qa߳_O=x!!]3 izszbYD- i T9T2bB9fe#C_ϑUVY uܦ=pI":.2њe.D kw&Ζcg m>a<\<^+*Ǯ8ڜg[׷hp,^|5Ӧm2*}g糶MC:kE,9wo$ `չtwCw_n7O9nhtyφc ~h ҝ gͮZysm-ZqP-\?6s< rnhx\G1qS)ZN/nݶf=O黏QYݟwCܪoL᨟BF(u<}gJ` ]5(@uw2TeKqż ѕ9=C%eh:L>z‚Fi0I ^ĽsrDqdzɀ6jQ! 
rp}iolO|gTG9ZO\ v7&0ݼuf4?SűgBGAGCGq44cc(A@f69l4oHbcV$A puc'kͱی!,Á~fȷhK&ͥ{Gtmzմbk}Ift_Y/R} M9(ǪW߽?.+Ëӑr{K7B]X R~ATQa`rBMu/jނSk RUC8vZlT8GǗ=dm h B搘 N LPSM !{MIEF|$"F+"a!p{\Nd[)&(5iS*:'~gҴ. >*|&iobzܨMz*}9V! T˨'yDc}Rl=-bұXQ Ewܻ)B7Xk狁>ZrWEYA23KMޟQ7w9E5( ښbZ,T*iN\LSQ$Mf`V{I) Ie姘<$7FH%ɱ*zc7gIkO(}8YyUUrʁ/{ʛ eB\p%r"ߴ[rgr2`psq6&ǘ}Is`$ Ye9ʋRĜ5qe2R}[(Ylao3㡶Pl ej o.J4~[LaZxw74n4: 4-Ϣ&p#51Ͳ IIY`mᰒd(!(fSjBP5D#l4=6 $!e 3җ8-?XfC6lZ`l䯒&CBd΀  f mZìq@2!CE ٚX" $ Q19ɸ)zg=lr\dXhýgEq䋒#eƣAzAϩK*I(@]-^b2_zL 8)kp ?{ƭna XtMqۢhR\`qO[,Ľ}gFdeydI $q4 y;|$&:1,!t"i$mIY/t\7Zg[^).qQd\̸YWTq@,Ӗ[*XJ d<|] QI'(pmakޱ)ʖPn[3\klW\cٷvySN~Z Ӷq'„-dRbJ[<^v% 89BH.Q LRT8q'2.xL1ִݠKp3"5SDcK)b΂wDðqm8i͜0q߳s&fv-Ț׻++kL*2Gg#'_T _ŤJ\[i%,C0Z# Ty΅yPVO6m wJGcs!h\ny N%828i~nCy'r?Ra7idRl}Mje$Ghi'Y !5A\2"l/7n!AƂ5s$6S91*lt>#Cp>/KϭY,I}z7>.5_VXآ$uV yk 2&_%s()D&('3&l 2#O;.}f '#[CN(<1(T"pm#9@BEKy}`@ eiy_7Q AOdq>a-/4(zl4 0sPNz£  M;+y/b]AZeԳHQۑ]Ǯ6p,X-6@< *Bay5xiJF+t(rxw gQ*9e|n$"Xeo $ݫ+On--%'%]Ϳy$BKfxe rzO -oIp35Cm>+h n٤P؂jJ=C}EePƯ7x6a&>@/јp47xKH2 rzod8/-2K]3<1!T@9eҸE扭; {^ /(蚓(p0J0wntz Û0uL_R*jjqV@/y0no%EkqI{UEx(3^O];:tk+OfE.¨5q-V0#*)dFR4=aLs^/[wS1sɷ69^ sK?!QlJa>W ^ wE%@| *l9Eka*qaǝ2nxn?|zʥ 3>Tt(f \ |Dw-_p=mOklgpa?J\nwd"ku8G?5Μ:f+Eռ)_ \F_}Y+ܸc:2+o,e.0M9jUsČ9_8:ѷ 9*Zy}'Ц,'[\WF4˷Y=̾0[TРsU~WMlE]is*_-d7nPcQDE4/l1B{^o,)h&:ͻ "CxaAa0>jRZ[Ƣc\J!c`I Œ9˔2ʠrħua;t IUp;oB!a cF@1b"iE:$>턆O= D/^,z{p/lj,03]/gK44t$ #D1E`&2#|0i}L +LKJ \%s_][+bp W_\iz!v54*ԧ̸0.}sۭq?o8_WzciĔN]DOb67iE2xk[Pp5YjRoW]e?LS[ ŧbewqZMEU^?!D#@SDDSbE=f:;S9?B)9G9L?`w 3W|70,+103Loc1>V7er: pMl \R &kwp+1\Bp%\"E4|8\0KUV}d&F|Ύ{~_oߜ6ιh3a-}2؇u zU.:RjXirM ЕkJ|oFKDIͥ3l-g,_cZN*8XswRY;K}g7J.kU 8'U8S~S/-hg#>o\A7NWwWRWٜ ̨?8y7+?h,L|ɸm]ּOj~(FqS%C 7:OVk:8 y ;6I`7"(5p4aM C:"2i8rA-d~V [dHKTָ|uxc\[&Mz7 4>|kxpIpκ)}IiťyɫP4YToz1ct} DT< $0eUaXA >)"7B J1uf8 ׶{<^.k9P& -,n*7W:7Qevjwq<(F|ŧI7>@ }VN}qc*JTn NOC5X E\nRpqOcaxh-0}JFTܳs-{8-!uj f[sJ??~ ]JF?N;]-]a_ i-@8ui3,\SB0Z [N?K|; E? 
N1.RR{]YW_.\ɿHp;ǏߟÇsLx]ko9+ .0mQŇ`L ^,pg"XINƳnIcv,jVYS4@eBaY+X6@\ii[ijoٴ4pԋK#i풅 $ք,Vvd=>h2]LeYyhoxo|N~u̫\ L7&+@袖\&' KҴg#=@X² CTZ3nd$9!*K(w3%yE*: SmLܳ.\F3ug,[2x蒹t 'Itgi7[sRF>GQ&`jY!ݙz3GUڙeȌ.1D,<10tdH%o@"Qh)%u7l+ E崕zw 6v;0ЁЁ2GJ4xTѹ4tqen:y-?.gp:w>چ;ћPg7n8󶭚l`gG]Qetz4>50zQyڱ2]9F茌Jԓq٥z-ׯ8f],5)Γ'>W0o@qAHG8 \~OTY \Iz֔)#a~h/#~$ؐR[.bzܤ'AY?` O`2ɐA4' V$&|uN ԙPStD}ЂI<MqX$sڗD`KmsѺPu֛>P=Y F@|=4Ieu ”Jn_R)Ϩ!-QؒVƪ*j(\df#w巘<$$ZH$A5PYQs6U`)/ʔf8'TOzU־^7GW*߾{I&c$\ϒ6dM36Weo((<0䤑`8\(s{f$`b&`9KL+Eŕt#LFe&W*laq_[(+B9­K?]/Rw gg0w~wA&ϓl[lER`,G(w[L4f.Ge䓦m `2H" )ٔxQ u&ߎXD2 ^% .f_vq_ 6 V{@7l䯒&CBd΀ 9G.ی+֫(׵v@2!CEJXP" Cp3c9I;WVv{:etS1kZDlqEq+ENZ2H퍴AzԀQ$ *%,^bj-Li 8!H5D.vG )cR\y$AP8[jOvq,t߫XgQr_*E5.Hdޛ¡i,W6Xނ$/rR0(9ef ]܆] zq_{+C}?{xpfgy:l;27إQَwH'Jg@ƷEd0 i6>x]! CM#ȼ DWd,K’I\V#|@.TsT)JJM mQ<8c+?=Jc59o [v.^:q5Ue#N:9QIZ9{}R 0D(9N7Ft ƥ lrT wɷ |5y5JevxEdiQKBg! q3շf{kC*k* Y^if}tGig.$jTG?>I@d} rC@ LL잰ZIeuY:ٜ nfqn!s@* %RhI׋kY؊Ϩ`=(#(͋۱)6J{:ֈWAɥ8oõ2jo>;]EM.рJգMӿNiX0<~tWKlCuug^'?.̮ c21>ʓ]J{޷Dק] UN3`l]K®m 5#7)mLGǭ},XfٲgmNoU[]>V>\Zm8lFi|?z}xu[Ѵ1?W{2cvGq=FaqR,;2Ѡx81|tܭKWo7THf(O>@F募^,׿:x8x9̓?{P׊`7᷍PWhj[6- ܡi-k>Үkrda) 5!F/ 1YL <_~YV&N\ڃv >-w$ k ɐK{Q̼%ʠqcd.j e2ap>ڰ.M>1s#*S!k͸WʎaMg.K2:Uufi~zқU˸j_z%<ՖV޻zt&}(}L dm69',MղBKzD*Q~w<2K Q( | R 4.X6E̚Z o.!F`F.íi+`y嘒1 ỳB =D2&(dz' xP4C)"jH@% wl2 u%'rKLAv Mʆ$RkO4<ƈIdhVViI#Ӆ4x#T /CUi-sdJL otFg2izt#[G{H)W}y:VI%\ʑ+&"-s \*!AB4c1 둎MS< Gе']Nk* x}0h>R8ݰы/&GB0jtj1I7yѰL3MC1<ȇfx@n*qI *E'y x%DB}bx>t73')nQhIp>}.K>+4?(y?h~#kH- /_i?t˚g;o+7{w+tu:f ^)&Iȧ@ճv^$ >w zv; tс U'׺t: ]\zwK̡:y-?.ѹomڝ?Mw6i|qmK(l`gG }g~:=uwxkr 6pevɑ;Y%HاyeWm\vw͒))eg]rRT0I 'xѼTv%"7:jkZ2gm).a=2cR{\i^z}/?CY& {N7yzYTm?pX%0Seӹr5o.RLÛGGy?G 4>3&26ȤhrbA*b李W.䐌oFrD2)H[;22- \-&VҲKS ^F%T%G2+A}VY(E@:0fΞYFޅq4O@g qt<3}'-a=ZZVECl܏iٚwn/R B=#2K4*՜TE[ d4AQ74X*O(bukkwI#Od'o0cS8B-KNY͜3x-x4>MϮv~}mrW^>4Qf,/YZd~4%,3կ qrA̡D:BZ@a:"9'[_7/)ܭwe].ypY;I%%>o<cX /w7k5;47ي7fmo|zр}l`E4=Q<4~^ow96uo~< / u<>C/VX|G%Ic_u^7jikjQ;2B. 
LWTq0i~Ѭ<޾T L Y B!1#L݅BHK/DyAѱB4IъZxIސ ȫn}]-'Bf?I0AQ۲6nŹw;enA\r1ۂFDZ Q:VUd^CҾc5Y!LAtV(x][^QL2`p4dZdk=MfҘ.MLY$I,!7k<)'dHN6^ h΄UݝNSY#SBgZ!x^}#39|:ZPt!E.9 71IEPXn1R(J pyn9X`n>JѰOj̝Aq16&Vۓ ^@L,]u ƜE4Q!uIcO4v(*1{i<>>mOp?IG;W>ެ\V\בx}44ց>*WF`ȷ8=xltOqicUuvv7R zs]&0cLb)jsW=Z*wI)oYXsSc XA1#{ŧ'|>T]^#{$"灁hY_u݆ #~<' eA &\o4@;s1/ qdLR c?CBOpgog2#prS=˜in7*JLiv+ tY >%ZZūf4x`RY? =cxlA[v{Y{;Yi[vW^jZѦDppc'`JцxڍUbKF7l3B@r*V54.dW:Gfō=[T"'FŸG?> UxIGֲ@DA|ƧU0\fCQң$^TDǍL8ˣ=S9(㪦aP(cK $0gG;tTI$'碤FZ|Q4%-"YSϪ Hd"PMUܞZ7gg: Y>79+7yk.Ϫ|! ch5:wg`t6Nu?wiLLU}̗+y۲^ }{4?9;Z΢=Ofo^wqxjC@^=yǼ}0>3--n9To>͏[f_ YDNœ28^cf$MO43w5<3'7N6u6wc`c7G^~Xwr5г>'{Ub{gZګ^{4>NW:vmeO}3odzT _JkJeϐ~\w:ߦ4՟.WlbTKwVeI>ѧYQUfG. {/5e%]ΡdLv!#*޻QFQFy;#+m>6UjE0cXB6"*= HLBN.}9|:T?׾V\f˫,Ȃ7 ST '1(%Q | 0Zcyq< XkOoڍw++ƠB@JKYΣǙ͘PCsdE*Z4x 1 z+I}dZ)@#3됴R1.-IJ$&[ 0C-tQM nkL8̳vT+݌cgxcyg?/ϪD;DGhyώnR)^fʳYw~/Ԍ7 `BG)4%a_M*jTi=N!+LS ߜ[(_[I o[zV v耿t> ƾ={P탲x >(|PT#1RB1NsYఐK %2I  KԔM/AyL2H$$fo9/(HPLm`aMV̵8R/a}`Ugks˥yv[>u_7>7<Ŭ|]b#Ӿ `z} ulWOB:歪[[S`=.I)RG EDPƾR+{062 ƭa i.gMmЩʙ$U1L1k cZИ@90r#P49A1|Tִ:9`F2 6gORwv~V;R/؎o9I.|/Yy q[A=R(_b4[h4"HIZP)0Z[%(ìmvl.,cQZKY فR@Ҕ dP:*Di3qpлs@$U߂VX뙃Ely~>N'OgF'vn:z!t]+8|HXMZ$Oΰ]QYuuxGR 0o< $mRVvQwq!,񈂿/^w7;h4vd7yK}uFtmtGрOW1j50htq9W VToF̓9VM~n98=n^eǣJvӴ_*0%5o f.&mdƪ fZXjmnzHFE,lM-69'g.읆b} XBEHA+0!dp9PrEfF;TYOA$t"Ŕee+'rqQ3 ):Orb `jQblu5s[bp [=cgtm{7ta29? 
3kX1[#`4H& ^TB(uIs'mSkx* 7FUU2h s;(,-)>m%nOF1Ek7Xkà1_eSJ)@Rƚ9o/( {V2>,ƺ" ̐J`tvk(dUV2*QAT Ĥ::e^3qvÞr㚗z:>Nj~yn5SN{8ctrn] wmI jq !T(X'p8{yHrϝHJ5PDuuC+fx{`r>hEՁ/^1whsf:a娠jx`dIXUֳĴigo&U)oT*蚇MG5`ּ,#&W/wŗeLg:VhbA`H.p Ø*i'")B%s{ˤqQ'6GG^S|{5r5(㮉9;A#$` s16J' ބI]$q%J8ś|cozfqc.\AVxd(Į=m> sI!>X/1 k Z(ڜ-rm|`w1&wSI1֙=Ki]|+|Xj~^^jlJa^i Z{1l-zuka*aitU6d~:|mechDKYmXB wu[_o/hϲl !9݀<usGp9g;lM۔Zokڅm^ii,*0AY-飪0\e,:u~㿈&I1tO09sH,c O'AB]OW3S:eXI\(wp)Bp jg>LLj9+)LI}ZO9Y)g3v&Oay޽֐ ϏU668[2PB*(:-uyL$' kpO;&b+eWpCY}V6y{3|>GX|S!X!N}/fa?rs/?ۏZvŝ|f1EA@(K)@arVS LxDvgo*l>H0C¡<[&UE4X߶<_E.˛ gE 2p/'*w2d&OR#wިT0@vYs?4iFYA⤚]]gC~8%5U[<*?gyZRW"ɨD.᧢TJT2ݪW8SHRW@0֧\婨D-SǮ+1d Jbv**QKɱD%kרd: + X1dUcWW]JuRrB ƔJ2q**Q]djQW*|J  :u'c &j8vuS ٪oF] þS6Ul1b\fWsEF9hAX%fOۧ7ٰ?.zX.l;!t R|)M ϾcHގ`^XMoG.S/VQQ)qi&y%Rs>q[eTl{62=oL xؿsX2讠#3@YvH\9,ɷcx8-Xyi<3瀠4\֘" |_5z.Iytp,y!X17Em[޽=^r# $g\Kr1^a&C^ZQ<(.&Oo}v9;Kd7ɥ]1a!8%/'ߎg}EUGo$*j-WhoE0dU"SQWZ]]%*[u*ErENl`Ė ,[};+ @ptQW6iɴkt =zٺ9TiwflWryt>%r`) kB)M3TSlkU[ZymV^[+{TȖT@Z\V^ZymV^[+kk}CP '/ЩKryoBo^>j.<(zӄ)By+X΂"zBZޚKL_}lelZk`ݍѦ;з|^rŖ/Eny" tƽ&X)3`5VKXBE: SR6z x$ PX-H4jX$"﵌ `l5ͭolnT1݂Չa9;lj1ݎZ^8Zպ 7p ŕՕkLo6cvM)yvzKGD$mw_8QL-E΁ b2IoOw Cbu^^=,goiw\͢%n}+ceZaٜ7y@~Awimٞ"z|:7eͶxͶvHy˭u%+!k7dO+"Xwi<آ`0t̹G,7w.`e_N.4# ["'+2X,q=kN)5aKӨ (WKDŽ OA#¨U1kQRk Ǻ@Syְ1r6t1L ~˰Ot\G+|Yq:0 5+Hu+7Fn6,2IraHG$S&,ԒH 15=;%d?+;(;\ ?8ʕ"9{dH&(sH)aHA)܌FvGcPw硫_2gpUtb 對RjT.nw]ONz2GV|}ワj%q% V0b:dG1;wɉQ Z=duUW~V:# )tw=3r}7H:ygbNo=W2Nx7-R.j(Z -hKXN.9廗SS[YW>\gp B'P}x~~o?|pט;82.ג&׽| O[ ƛ M`hIur͸Y4 S%*DEhGAY_.j,/ؼ}>ZD%F Wn$S [FtYA 0iqĝxI2z50.y!>:F3E6`DNpQJMs#DDja a,}D[h}KQddm Ămf s!O֖ou:ϫX"k"M:E.-Um/a ?LϒW^|PD6zNm.9RJ26XyXTa3se戆2sH s"cEbqeŤc'XJo ` FACQ{0rHRK8m0RS-yA)ıH1W0 Opà1rփy5w;H+w h=|!ps.Q$0-h*+8+N8kfµRsQoO!H+)#)/D}^D/ (my7ֿ&N!OmkH9oTn rG/MB1BHbd@SPރX  B' <"p&YfEVՎFx$` `A牖4yi<N;\>&g"sǥy#txXȪm^;bf 8VBpl[%0me3BUSM߱{{x?OGh,.|"%u Fษ5N8 \lj@cZ|Tz*T*Ϣr4L OIHU gR &N &>2#NE:K"@Z)Lc⩳N~LBq>^tF|xn_{׽Uaj{=hYR~N>! 
' tĹơTN tDh'@TIJiV6C12J1'hCo0itY31%G/1klK2g ^˳yS_ίAzfz+i̹`SV΄ !;{6,&ȠIvD!2Wt^F)s#Òy,]ltPa@n_E{K \Ėl2$4D@:xg &hYg;:᫸ğ':~z-vPMp>~| i*_$k%9&&jiVuۈ8yX& G]*=bu3:b"?Ⱦm|w] LLC] $u3;u5vaݾkD,[ W:ջ7 _wG7@{w;Z8̯Y w@${*Uh>M S\Zyހpxž?B0@6WVDTu @D @8$GK>x}6>*^` 9μM$\GeTdLB xZ5Zwu{I&":/  F8BdiW:౔9;(rA8^T(_ )m]ge9A4zcG~+! Hm]N\ <ըI2A DQW'Y (5H a`@BehbqRu{9 =s&]Cw>kyٽ+|GxQf>9jN";aox{?Qhj4ɀۋ8x^^oCl6_Uku}7??']lO*wrkq,¾y< -ŀZ`o;j ZZFWƥtT/D%TTai'$5 BR |hJ,[qZGH R Ŵ1 ݢj > )?g$QegBL l"*Gѓ dZ_a3g;BLNX)s|.moqC|ǻ&d}n6c??gJ(Ñv*C7=!4$؈#Isv+ C螦ߣ4| +,aŮl᷃BĔopK!w飛Y 3":@5pjHJ LGΏy31 wBaatM6=V!@xJF2j)! HBD%|RCaJ(}mfm6xٔ +~=TR?cʮQP4 !6{Yz&ɹV "Q H3"d@ B0am6@b.ZkTP\pc K)/hVreFZ n@唠ܥ73dbTM>ַ`n?"T`[v^ 9Ցs|D1m '[7=<0J3ouC&P_/H3Ctq`:l9uEVmnO϶zn\M/o:N Rgιf\'!8P'#";#__DMJ>sx=F0i+IR&\R^pz3PT3XyHd<}PX ab aS £ǪcGb8J9!2$4?bUhdUes9;bUZ^*[)P,V%T\e5 \es<BkWJJi;!Bl3BpvpC)o!ڜ3Hu2p5' jC ;zV.W\õ0'Wh!WaWZ~{WJ;v=EBʝ<X~rU}]]|Vhrt"o?~wt2oO 6G9КRBp"8z6B008K;2W08kx~pd{v0޷D+428rq*p*[W \qB(g'Wgd,Wh.'î^ WJ\}pE5Wj4d6*NgyX\f}$ 2]N)pN^0SyhrUou7Rk;1!"2p2^D `+*:Y ŧκjoxε6`%(*ޟV0_^ wI朻!9ަ<$=\]c@Z\Or,C$)QRO9ji.*`/  F8BdiכӗCӗϲպdEeEY\BÝ\D~wCu{˙^BlP^UkfPu}7??'Ėlovx⻧ZO}ܝY3hZ8kf_KY0mmЮTrRX6Ghl!Ka={ɪjϫ 3}yd٘WwY!ݎz#-sk yZcI.*1.&Tbj +v6 s^Rl0%62DFe!g \(}0`dKY浪)m,AtBٱ\鍞Q65geJ2N/<ӳˋפJ]^^ 15Kۛ^%y-wZ 4K¨ITj(GWs{ӂL2׬tBC9ǐ;_uK'S\L̓tfgKE[J٬3QC PBLbZ*Ol3ݞk/++Qy~Vyɺ']JI ^p@9_T)Ψ- )"/JV)9"W)&?A_kie" J&{NHo59mШsd"R!2lK"X͇ѯ/\oK-Y Xh8p Q|8~IgQINie āFq]\:qG| 恏3-o.Fwc~WCa0o9 _Ǔr2>:c$M&?_zGGV=ɓ?C_ў_{:]׍Zߍݬ*/?I'6~ן>/_~?{T֊-?%zx> ]V]sˮ5];K׻SN7!>Ս:RAUS~ S VF~tM|ۮ|j;'I{dE.m^+H" FrD(lԋ$ai]~=aAyƲ`t8'$Z5i&!hw߳T>-aSɦ^ՆhO ׆PO 񜭝j 5=_B|[?Khp"BE;t໒!R9&ofQ!/ѴmP̓d."yt1! 
cT,("%aR #WBMʩǺ[db6$R$QPE| G2rFF&(QdcLoI{'sGC kmJ'W Az_*蒬8$=C3.!f)+v%DDd yU%2p7k^aYl# Y+qRIEQZ $b& ] LP I>ӏR>y^k>jmUȅ:[eA@99ȐA)!Sa-\{y{w=n;wFX报<iWc 'sHHv Qwh.:Q;Sp{HS4+'y+ѳ~%q փ8 5,v>GF]RP1 z+I}dZ+@#3됴R1.-IJ$&[ 0/ƮflSa&x9)kl3Vn} ڧH,?/y֗yGiY`^,M*z7i˫UarBHhz ͏(\\KI =+aLRtt> ƾvg/jcP6$@|G1(Q k)Q̹,HpXȥޒ$I%ljʉ՗AϔW^H;|:K_d{]' `,/|H o;w(JA@X9ؽ!]I0ncNKTv9kmNU$b=DHY0V S -1REGmMLm$,al3qvg!ug_pP^/ƿ-xv|n)НNurv[^mVXf4Xwo|ַWmŵ(ڴh/¶vhyPW7;unygAأnix]nE/\m7Q/o={^OP_ϭCOW߈ ytc6ҧΘYiQ/Jb L3.Z0&;-Mg-uL2jGˤQ'D U') 4ZhBU6`>JP,jY\ 6Y4DM9೐A?=Y)A*hɠtUfwT8Z|l\g4ݾ괼9『Yο:q8yEz c,]Ia+@ 3>yaxHZJbCY]F6(ag} !^oMv'|7{Dyt{ ẓZ ;]h?>hty{;{˱?'_&gW5ëUxV{0A#_XmATI[7٭FV\SK,wƥtTo}>sLgakj`i(ð+C)h4 !{BȬh*k)N !yEDPB7.jfC!EI6u}Bww,0AmpM⵽Y1Ӱ ַgXbDoW.Lq9 '}<3ʹsGIB&bRxb[ h@)ĽC~P ԙd (&v!2 N` 5DY\CND/ȇ$0akL>P=q9`.Y #RXULcͅ9EYZHXFiy)g#OJ2e8v7,>X,VI\S-x?"I9Z4"&v-dRbJ[<;Af@GtB!NDdNpŬ 0&5Vo0h(74P9"x> uZ`u˭$3=޾7Qb;30x(P%-v J`~7u 0Mb8ڎ`&wN!t#w!ݛQ+wR- EE@(K68U9޳W癃xtCsR  χ q~T}2u͛w!ilfoL@훢Ҧ#zTG\>,`5WhNj:H\ g;H~\|cOt`~J+dK˸W\83TbPǠ}F>jaFŘ{ĕj"C:6H@7mUo@ز 9i.V>1~Ϯ?0a=^Nf֧ŸЫΐ۱ӎ_m+>? aCK#(Ij,moT+#AA'ZljGn0a'!fg#e =VIh%Tk}d bчqz$l?f Z7GҔ~&7H6 Z8Р@ X&TaFQ{IOxt" 2;;6goaeբg7?P2tQ_u}ٻ ޜȎ.xb*E)Pؠd2 uB+;d6юdTc=}"GY6%xeh\!&R) HDZ)7"DI(1VN(X}TQ K)Ue.L`WQ핿rd)or9ݬIi﹫M|v@/Qz6&mxUAɓu!ICZj uR uYT s*WAkb^c4n2]U~\hL#pռn{F[kT,ȵnHKNlֱF[݌wve[mVe{۶j 9]E L }6QyټJ~_ֽR_{Y>Y@nоYb7Puaz7VsA:1 nv EI\΍v(6Qrt.GO >6wLRz3chyePoƼMi#D(K7Wk*ztkX8G'"0d} #Atu:.N?8GZTFJ/wKPDƄP)"T2LN=sߞCNGRܬgsiNNPDH*B; Q:a&LbE$:&Z^'B/@qL/s ug8 A)(L͸ Rb|eexA1NG.<;^ǍuA(%I-c8ёlAfOSTc3ve(6OxVxaA\ aӘ;:M4QOB`஀,L? 
iHY$*@!ޯ~b?kJ uR0y+il,ߢ;DB3ky85f׌q^3ky8Wr481'f 3pbh81'f,2pbN 'f 3pb<81'f 3pbN81'f 3pbN81'f 3pbN81'fl81'f,2pbN81"/" )В qύ0K<%0)xM\Q &uD%ԖhQjc# {Dsn2J_r!ͿOxjwMhP.̀1T)b+t@HmojRz\TkXe[̯_[rN49Utd E*aPL\ ?ÎC.0ȕoonmS_oAh_u ~}݅\lm}{/f%ZO^sQ\}e;P_6%̗ۛmִIR+3M v<쇙7{T@▱ κ)np0%sH[]FxʈAeҺn/'puӥk$Up;o U NL8P=x:FY^1HJ賮Սu{}jV;AR8?k }~_ԻfU v=sUov],|[`P1dGʨ\D/iv}G[F7WWu 9n$qxBv02_[-ga/8K)EN@*Xr02Rj "[o$,Ȋp) <aViǬQFYJy>%&U5[k/x'JzF;{=si]bn"+-էWwW=iS|Un^g @EG.J xXQq"5ᐢҢN<* 4A1\7t]?2hu;"E Rr ZTD2'TUJBR%pD0AP8@2q^X-(XgaI?̺QMsHy`aDHbXݘ{{#^n&,2IrTȎHKzM\PK"B8$lڳpo[~2QY>$`a2ٛvc>5>j5T+*SqOPhYD);\B^pa@A2پ)%"; 0!L9DgRP;4qay7_*:mqkrFjr5+r[ϝV7僳Ӵ ʬ8OCoV)c>TB*QM۳_M}:>>\\->*`HxnR[]c*Awm>Na8#Nը:k<)Ԕ.V_7ϧE61q\/A뱥zlD`8y: /`Xƶ%[[b|uKm͐f8kmfqa0`opп EosUַ:d[}մZˌ*ғ7v)nj!]+Ѓ9ۃJK;FZo_'tg8Ӥƅkhu"8 ÑKC1EfGnWU*`VWԯ ׻<& W/^~~~7>})&8}9yZI[V$VW4jڛ5M۠iIu3rK?. -w+I"~.G^a$3~IDkM4Abˈ."+b2#RQ0/)C&PC[fҥ}8y(`D.+86ᢔ+C ÄA\Y󌉶PDK}RgB;[ a$0{0nsA ״BK$5"dKvyGM9Pޖ,J]*t>`٣3fhWY5CwC#7qN""bXdC܀ vŤDKVH!۩a9pxaR[ERcibȹ[.+= ߹Q@+AB8MfoQEX J3T3;*H.,-Ra8'*A`WZ:ƨɑ>jN4Fz6PoGp*lRXEd)R!,?c5lÕlhu&K&+?f"g &jt4(^}Sh: ({s#+؍^zɯvO=]^$5hyq1VڀW_vJT_{n>LLOfUVY[6}F渨wkȩԅ<"j,=*), "bcR7+l3]R^ i/`̕':7/uks#f l5{Z=KƦc6kXrmKbyNjf$f0ene썺cbHt3SQrpƥINƓgZڍ"˶؞!9 /{Q,P`EkEr-'NZ۶(KO'3$3!\J? 
XDـc΋e<.>i<v>E{]=ݒԤ%6ykUNr"EG%k,ySի=R*u)7X$\g JAd",7#6LexzUWi'UmN_7W1YT癏+(HGH+CV(6ڵ W;E&E?K)–uw㤹X4d>8!'MQIs?(41%1:p0 !$<*"Kd mvFN C`du%)TtGp "\̕`5r6C\>%cP,BH xew4N;ZWqq۷wj5OJ `iALWW0~&/.|&U5U&e4V (.yÒ42Mҹle[->o g %ۄnP7ظm qUZw =)ZٗYJ$VUr9L1AQ7C*ORd9:G BO` 0\$+1ij"g~g}|3U׳gg^ {MڒkβBQ Hf#cp,8PYk9R* W )E\6Ƒb&sK>YBWƝrSBno~.W .9pm͎/8MW^o,a=͏mb7F&}mx,"}qInxbO\takg9QΏwKm6iPv},ZS}rl#樐hQۼ64J]eS 7!6L5>k+V?WX l\1/bt'g$"3H ,h0)+j&bK%'M;OQ32u$c+ۧA F5m9b ;Rxzx%)pcrёБVx.434.2 ۛ~ dQ$E}E Hk4'+u{Qko3KO.$_r|=s d 4{jΥr\ ^毟So)Tp?ݧ^)Y-}N-+Lm-i[9G߽ʱjkj0(Qh^5uL*\5`ruAHĻ~caՌ K-%8Vv8 9$FXؒ{_/*dm h B搘 N L!s9Qh2O4(T6:аC& 1Zd MBY/$9l5%Ksfy͆>31ࣃǂ9QBbgDŹ$.<$҅J˺$ү04 N!)G KL{eԡB[w h쵗ᖪK\(ob}HCm\X߮{IRdA5@,D1uql5~1Σ\*U2ɐA4'hKec=y'xr'nOIܞ$/tV):hF0ia"j+sƗKD NBdu ߉8@=5(sAa2A["SL @ %Cvi–n\0cɒӞkنDJbr"&1B*W+v\A-r6otqӺ b؟+}{>YrdnD%޾\~+N<^HӓURys&:%\gZHYƶpA[@g39E .hrٗ/GJc&HPҒ-+"L+E͵t+pTmX͒V& qƮPVKVndqL11 'ܠqYb,IEMB#"JV%F4f4.G2pIS`>G4@9$cE2R)5!(6LCX"D2 k"gA\01OEjW]6TI^%LȜ<$Kyc6+)cj וYiGŁ,dBȊ251EH$cV2qs)W#gxd=8qpգ\ dC`RfG0R(b|)rxs㹂wA̹r׻굾ۊ"K}ϐeZM~ӝM.lu˵2&&cxƤB41Pl1fL  u] ~a%x&MDlKGw =158 礜kf(]{9v+1{kUa04Q쿍'y[ w>hOEs}Z\FJT5#iO'tN^'-8:Q`c({-}A+\[`Gz`4cpH;󗶧AqؒH& }d$r cUOV(Qi)hxNA%uP*޲6afR*}:]0=Hg~z7qHI~|G57h$ o>nì&Wc\1yqQߣ `t)Qc"dF5PD'Sotuy}M\&Oe_(XzU!3XѰ30,M\T PWb/]}:=.IJ)oYXsS.\AVVZ٭laGVY%B8M%9& .=#Kƒ+ q.\VS % -9A`=qUȅWZ. u+)YN_\-;qE"?sBB‰B%N\Jqe}VB_/?'Y;8e~+%Z_|=:zb*U6g.p{eO%>J7 ciڼix#rTLH-BȐ ^1z\-8NbN69˼{NK13c68nSݻv7a+ޔƞaq*[5Sۺ~J`d84^xPF9uIKCIw̹ ;X hz{Y0ǟ̶\?N=Jn17o4 /7ܱh2&kڒTV ;t YmрԆv$k P\f|jʇ.3pl-TL Z "̓2; 馝El<1Le2YNYj4Ա#))g=%0X)T=;_0s?Y<7ivnG mf}9_fKz3Q=W{Ǜosś?ou}v'?HoM1zY:qCtޑɅ>e%Yd@o[5(ctʓ?*R&.cif1%0ڏ)Oˋ;^}NQxܹQ/6^o4ɭOjkj0H Uh#i1bh뇪7CIg<:^(6spo,G5޾zD%m4 WP\w]z["~B. nu;Һ^=M`Y~lRˆZNkl}'[r=? 
k>]o Ǽ oUui{N:qtpf¨a5ß6]B mކfs&gp%h-6?mLXS6 ^o3Dž(?{WF-C,H쇹;_vf_c]dc$+^,;ȴ-OڀnY)wv>G8_LۙSXʃ+8 ᱺk_٫3jZ%Խ;rMF/IEB=`s IV8Թ"Ū2dQ&EiMjU"Z-=r# `"gCFj|sj⁻PmMc4S6FA O8(kyRG7$y<)U$hSR.8S;"ÿFsMMHQg `xXkBrdb!k*TE1`y0O!ӽ7\r }r +}g(\&;)4r,B@F\Xj{){jnQ:Vg>x3[{7I1'8^>;!~qeoqm׿2:oc-Yz0HF΋"xbߍU!<b"PS|=Jȣsq@#?0|4dxz'[|<*Q|ͯR]Ǎ>s uB0sa??ϣs;yzzxP֐rBlGq5GDXE>5 kߍnb_P?u`ugЂV9e2p2<۵ hz6O{op'ygb[Kmm N5#7qk36ڔg%g僞ݴ9>_wsf{R&^ju|3jZw~a2ZzgH?3Fo4J?]$[{uc_.mW<:+̎FgOE}09x:x'l$b'Ƿ/?~=Hz?߾x˾&.Gw鿽vM+]/.R[_o\Izb~-Iݛ{m4mYB$\B@l,"R FA-t 6d _ةIzhһt|9Ĉ3E0s|碎A ^+obV*ST0M  | ҰdQg/9|@C1q+ v3ȵVk:}gcv>cI>tk\5/LvMiMeYAxFM> :,udu%cH92e<s@;sL9LUBki") ّ*ٟ"$RIZpTW̆BSEq#(Lt>o.@3qAhL7$4}ͺ0[ҫ8~><(_ טVd'3o١ =8WPF8 C3.k2XXڸ%ؠ%`HRL<8g#"֦tg <:| UlkK!kL*@:D Ix_ғ/:aCG{^mWXUFTF\a+ HYbƬ5 6E]^־;6Ups0hT8Щg(7ڮWWSvkŽEiWwX kALd"Kw<>iNh2OunR5OcԙPFٹ $" ;%tptA(s`NsDɦ z˼(ކED2_$#4Cռ.)&[ZQlGMFlLB #Ck|'TkL.1&CVZt5M45Ԍ:YeZ.g 36+f`N[ Yw6gbX/~<fdO"l,]t8;U*ٯW~82 ιȓ5؎g0Œ 1dAFt*m!![R*!Ҟ'JƮ'ǚ^Q7|39Q) r2` ;֞YwvhUf#c__B5' *_^Ud|Kw;RI˴ыh.j!QI)U`J,T4֎/eDrYYqu:C.q֋4o<ݞ6kL~;}>z,R&+'P#(W!)2;L,f%!3t3#kebJhs_JH*.0Jj6$%m$ 1d>k֝xWd_cֱ)x[~/;"A\jiu?aYzopWwtB_TNqUX::DwF*$a ϥ ϭ Ϧ Zc2tfp2bYO9TʸUVߐVES/9^ 1 ir_f(s(V'Ǘ'i?w-ÛR>fnB˒4OKb?whl.ږq+iݔ6Z!6=ɣ2I<ORCMΪO<]yuzfrQq4k~Ƚ㿽ӏӳ۴8ۀ o\jlk [`yd4XVj-x5Q`iw* A*i\sHyZxt\n qTth: 1 qL 4|BȊ$.yj1,udB+rFP6(fC>RDt{H ׈=YWKt>w4+O.+ϐm1@ PWTAvī]Tj^ApVf`jzE[sƣݧ/\hTBGP٫.(;B2ACyx bHh&MڄRHX4&9f@V91;ܐZ$}ف}3/.%W&v{nɧM5g@Tvk} s]yo#*+}m> H$61:1 Yk\`LKv H]lgVѓ%&y8E @8֧ -ז!8rcHV.cOB eUN-Ee@p>a-h/4($hBAa{G, }[zff -ZxvWpH`hQE]kv}@hDv≁Ft.#)̬#RlP2TI:TsŝK2hǂVD)ɨjb|#KR1aNG =WG?c Trʼ7W;W E"|Qb0C?PFgaXE[|7Q MBT(p5zkR߇w<]p;a;^s=Ev/wݺo0F7ewtlg;4e {S؟}g$JE:(ØpK`o$Vd8RQSDy9 Xj@w7M:MP:lpdW9%@s rD΁99xs r 99"g s rD΁99Ks rD΁99"@s rD΁99"@ 3Π;G&m?2ffn|t\rza৽wm?ތ{WfF'v1)h=_&5kX|VvMC>PEoW.7y_"(RRcɜ  gAj>Ie/GWow[n_zw 1Tśo#ki26-˽}y}g4`z&N MȈ J%/s;#r''$ָ?ހ6S3|XmТLsSSPZvyr =Â0[D]M?oѕon%I5<'g`hQ܂P{N:IKm(ʃq}!`+~;[ {޷)v2dxvrҩv:YPp Uk<.`>=Q^Ny);MזϷve.ӉϚ>V5?/L/nǿ~vVl]׫3hYvV6Xv>L\\+7 9FΩsj9FΩsj9FΩsj9FΩsj9FΩsj<1bsj]ӵW˷3-dRbJBJ>Xcc+1{%g3b'>B q""sx 
!3F61K)S*(Aqqcىp2qtcjўPC}mq0/󭥪][m|HkJ.grL|+yFkD*Ϲ32$;ihܹ̼0o>5y*@>708kg֢Jـo~Bf 7o·0}7]j7e 2<ʿ&ZgOD%ԖhQjcdVnܠ#.qk7SdNQBOȊs:C^0-`k U cJ#68R$#Z{1}Xe[accmRLxq|Ꙃ_x(bҁy '9(~&̶pt=l/'kovݦ[ *l{/\{ &Z1Flw6#AZ10t4 TT2<⼋10 .f)rH8+ʨB]7ykZ*8ʝ7\ \@cZ1b"iET$>cng3M&!W$ZL;5Ӷ v83{!AzXxv۴40)`/:u^rÜ%Q [ZcI"!tuOu\s gs8&He쫃՚{b(ƈ3BGH4`Oﴶ`1JqTQ( ȀRA[R.k%|)$뷍m})dk}޸}`6;{8+8煠wo媄"t69ƙ?G !QQL{J&Fs&D;*?F:cOt/EF\%}bg 31-<2*21\FP)1*ō8`1waʽ3ø)a.Ǝ`g꾒DŽ~pv5^ z{ɧl)dq'h| 㭂W_ j)2#t%{ބѰVOX8x̕^R!JUr(08أEnRY#$ґ C[BH0Zp ((hs ɦGgM%7&ê fB퀶mw|i泋.I}2^狀U5#&,2R`+Zb-0T1r&UJȨ>Fؒa_ 7-:EKn+fBe T,̧od!fR/%a^hR17ou]k8eRTjUQ/3ya*p{X͓F?,Ix?WORpI}o L Uulpp T|VwYدUҫ^GW5.c_ po6MrmkGz')ܻ㰒 Mӯ+ˢ(>Q-X,nQgfpZ  P|÷M=_D{]&A2I͗yhz3H/n}Z*="@TƱEJ1+5Ȅ+֖gCW6teCבy" tƽ&X)^ J3VIx0%5tR!!!x0:U-$"﵌FMͭUcRy8Mnƣ~$W7dlӫ(N֒:ܒNG_Roro31޽xVi)ݚI l&ʱIzjzdI%\}`>.4-~l>HunW4zmEYE+-7djoqw{ fmowس|KǍ^Å2 o5C걄}c ~BU% C_aKn\:Qw%0MӖԐCl5pZn9.w~I6 b *2$,$EkÁNy B(7zO.^FAP")9IE$s2JaQ\ZI%pD0AP8@2=>({,?V*aDm.2N5!1#Qrb`wc JFœSm5"l/7ps4H;4Caov C pKBeYm3f1!3RO%ǽAi heɔU jID$&+qwŵ!7\2QY.U#ظy&̊믥)%$`O0FOaO姦&b^8M_0C+yƞ;e' E ;|md&v(0!L9s)(ԝ4\t$ t[VsN[lBQ\ -70o}ϫZa8cqF_JwznHբZ׋*)݆o6J,ki+ia琾T%RʒFF7& aV߷s1Ӏ)έtU?O*e7U\},??|}.fwvv?zOLS(W`-@=rmE-ֲf~vM.h% K$ք|ڑ |˗/Gݲ "H~u̫\ LԈ utQK`.QIz|oº4 /@X² CTZ3n$Dƃ$ʁ>izW=sa=]bSF6',-*kG,".tD.]_Rx3?K3+E49sDeo k |L>`h;sQv漸H%(`g> Z hז52 -`֔EJ bCnN mtY`Z ߺR[Y)ÐN (P'sjfзWA?lhxNk0G TY",XFqk4ׁ`-A3:T/24es4*K 'hxvЪbpϓFugW ԋ*9n '7B&!SdFˤe\;o1\[65X+p)GL1H " peB0Ơr68tltlf;+w:^ ܁O$4.h3 4ߨ&XgLɸ+@0%%Rn߫RJy+GǑyH%"MYg e6IT $^- t>:4SsPMX>q6̤{X0q biuRi6u Bv#PgBeLuA &4fE``i_ANc;u.쪪6ٛCmOc|>>J$!2dJS`*ռ}լJ824GaAJcV4^\df#w巘<$$ZH$ALcJQK5K ?_( [͓۫5CS٩^e׈z 7ƳdXӊm)HTFq '\1Fc̾9ZL$L0X6r) bsHնĹ^jX-63ֵ--|P[x 7W4;yaI ^= 7[lEIYQz4ǐ)]U `Η$0"_lJE :cfQ9̂6?ufGøD17jWZml_%&CBd΀ GN[mFCmj uaF]9C&dȁXt%0D-H%4s>f,Nڹ87Öԗ~L0n 5?ֵX"bg;xWEe'fF =ȀsjҬZ Zd˽WfZDK) č$]S˨Abg|2&51I В C4Ĺ"'z՞dQ\z|fɺvQU]6qyoJ:dz&y'HGy)4#\CōIk\kvkue{׳`ʓPlǫ59LAFުQ (Q2,Ir6zn 6jBmj3ddNud dhYZ&Z[0vgzuv]v~Ih`g{3"3='3sog%]{1Kvfcdt4>~>3kzsUXgc\i-tsER :s ͕a y6檈 치"-M7WEJe:sb6 <}K vkDw)3<m﬇Db/L@[o 
[binary data omitted: gzip-compressed log archive payload (kubelet.log.gz) — not representable as text]
̜$^= 9n=Åvr~/> d}v+['GBG/ϰly)yNhb@¼~V92JdՈ1[:H 4$\E[R.v!,* mEr.!\"W7{Dz Iz,ɬ7ۺ~7Ng-ɗfﷀ~2+|1lĺrղR58uTJ !UN \\ RU;8ssNR0=O]]׶SY+;3Ud l+/ii(RXjREYYVJے%l )Z 75Rꐤ%uފ8N%yϷF-~ k,uh`ܟpCE}ʸ3n-vGm޵lV}"5uNt~ >b 4{K/ډ2% .Mr/ygO|9]̦ E$:69beBuC~P Y0U6d6D` Yk`!(jI,CuRB)FWᘋRr؍9=WpQYXYC1h`Ŧb;C>P$NJ~|Ji2hp=M۔fF\OR\X\ZcJg4V-Ra1^0o"7է_x**bX!5Z9Z4t=P?PY X З~ٗZdZS5p;rK>Fmgm'F[޻N旮e?!edqNrxR1c{Z99֙bU\$׹ a5[\$rm=%\]X%nd\Psd  ]c8J ,j϶8oxN7,@ͿӪV_^Ѕdv10If4"1Xua2fX 5;.'-ŢOA 7F!E AhTgveW< 15ʒuC/u e;ΦybK;vjਭ(jɔ3sUQUp_KLkmI_?sXZ~Lf3d ,&0ik-KQNo5Iɒ,ʲDrL#񃤺]տR%pp0r!c2 C@v4H5> 'H TKK@Fc}9P4ey9ReNhtǮ,y';q EAj"*:,4*/)"Er@ZaLXMFҠ3"!Ɓ3 &$^2:q:4a^aXvH}klRU[E^3Q:Z%EѲ\\LSd^QA`jFT,yx] QI'(p'!&H[cWy([r7yx6⦘r,B-:ǧ~ԌgT9г)̜fj9b^9QX+b 3 1K^?MaۈBB,[JL.0"~ w>]'?BȜY!$yƨ𮥔)pOxNeRhZ_nRёEwłJs|[ѴW!cZ+[B<5@\y`:-1[+Hθz4e^G[BKOD. j?uSwa V:X9K,A1{ŅKw#1g>yp0yUz VE FaHM 6Aqc [$wUt|OL΃ 1A=f18Ξ\e|-V,nl>\:{BƏ†0 {GQN#4j,moT+#9DK8RPiQ'"Z^- N0<`P`c% h%Tk}d bчH !CzI>`K+Tw`t[Z);n^w8g+xk 2^%s()1NAcMQ.8O(g %N#xFDO$Ϸ&y8E @8֧d)ז!8rcH)7 6'!:ӊIpݸAZ+ :tGpXr<{ aX  j=6PEPE9^('=&t>qQ>/uZ4 }D=ƒi@dKx !~np#/F2Lu{k%HXRXbFwDqt^N?\){Z.e 'c;.F%߇>+ D"g AckOޕi;Oܸnj\4/+ͪfڪYhw7+mܮ$ihs:or lb10k~+o OxX3*dtRm5.J71Eb(;Lbh`@\+ͻ䊹}6[] 7**=*n6ָ|cFP=1s5]:h9]1?p]氮45q-GS+R3EyL0&}=V,=86vBj8RG=b~ס[1T)b+Rs#A29HysQ 9@;6̇z,SW_o~Rې Fi0̦\ ?ÎC Nl0ɥϮnMz7^G +Ѿj{ءЪilo# &-9|V{,lKgnz"(`H  eeҥ:te,:u~/yIib`f09sH){ O払AuB]wֳ\ 9`-@KZGK!hL8P=x:FY^1HqJ;ֺ՜yOzR8TOk~tX]=mnzO4Ӥ43M$i-̫B{aM8b Qc^xӌy>rtIj)qS zgx5Հw˓4#Qn*X02RjMxt\Lh0!Sb0jve*jI:q8z|Z#gˬXsizD.~B (bt=quv_]}tkaɗ,gᅥMg6neP\*uDȥ `< )q"SaGOBsC k.np]X1LGy}HwD@МTD2'TU8"` (O ({(? yXqp?=nU@SR#pIEcXݘ{֩eƦ"k{=nBոx)R("%=q[=]ڿ34N&k'.`[U31[StV=LkgizzQ_xGjN̕A[ov8 Gß-,5C'{b|}OWMݐn8kfq|a8*`ŴD}Mͽ zmUiв\f9K73d[f=@~1z>UU]~FnYH0ܒ&X?&. 
(fI F?675*`j?ng?&ɳwo.}?ͻg}{e\:j$AM¯{ oo~@׺YT-QM!7r a؂(qW!|L\4%e-eW5fWX%fM(+86ᢔ+<Ž""0am,2L3kmMGZ#;Fsspo[trY:'qۑ(1avn,y`zK<9P,J+t>`#OkN!)Ǒg'Z^#91"zAJIS,PI,AZ"!hCURy0r(iㅑJmc}J!LE NAx[}s?蛩Wut /(9p9ιDA௠F :KYpX4Ssz A:N9N R)=Qa=1F8HG͉H/5uT8EKeVѸd*"Sv#EDto"X[=hI|un3 JBۿUKH,U ¼O$՘݌'sRǓѧ~(=^ºz%˫MJɿ!Գh(|PHK>(i1AH\3^:Ǵ(XĞȐpQCj:=%Pq,z̝vX-"N)/3ø)a-ksZcRXwy0ۛm>b=55%80akm6_ezt-94&'Aj[Rq]lKlmx 9\rΣI b:k#zZǫ_Q5.`ylU 0>e FqSL8\ ଃ=>q+ҜjϢdoeO@Sr S0In2J ^0-d1q .DxІ D 9Q[ alkl2v֗g3œ{sz%69eߵ\4m>M^JnAA2qčʩQTLpRhNpck pSop(RĨh(UXj#?- A9)c+Zb@ضy9sJx8ueg_[|gmNQN|#R ]\^O6i $K^"+!QQLVNvc9Xs+ X>/[[ETC|t.}bA1I(d0Z<@)$DKvucTmJ&8Pcpr$:WPHF'Sκ[#Ҏukl7;i^0Kk[n-- Kz\B5!DŽ?ݞnj8P" xgg温ƽjg~_+_{A⦙Ԟӛ؆xc$ Wou)5qVK(^5C:wb<#"D7i1}- -[/fل{! (sNBS'[CG"/M> ;ikBLɚ;H_? q뷾{"\**Sޥ\P.H\!JWH.h~(sI8wqT2zJp0q8*+PU}&vW%GN\q%)1K ,Wx9:6^~TRQ0RG~V8S䈃b}=q x|F5F=}ZܙӴu4ϱkmG7'!1Hsb ]/w.GpM~{B֍ڑ\=aapٕ,oȴ!|馦ًcMSGedIu\ j>j~.3Вwۓ*Ns4 oĜJf_)p: CZ(MW*l@*cS4abJuxe.uK=e{tZ d tALN3E¨@,(  HD3N M G#uJo3c|4/=l Cυl [W} _}[o8l `jztxr)Ul[x QLKO1L@$BUs<(%8xI)mA.GfTΪ-Dm(&bZY_Ϸ_ Oh4%ɖlUoN>frѴy qe !>9_c`}8)j]r^ioY*u5˕Ywic͵{9>5]Kl;QY1 8&.x\MY"!(O9'Bnt*2lRJ L!v•[] kMiRDWMz=tΝX=u` xZL_KЊz=UUB>9}d{^c N(}9iq…4!&dK5k Hw=iXǃ/Ur$cJITz5E13xl~-PȬ_?5#sBY%SRfGl|x&?ih=6~vvWeW_S5'ɧZ% دp:bp[ MϏp9oZN]?Mg!ἄ)ZB֑]-[|v mϑ묉jl3u^r?2ck=?ws8=́#}nb [Vl=}WwZ!r[|Ec<qy>+X@Ǎ'b=c\U=ZN/㴻[lSm'nuKxP;0(U1S*)U#I,ZS]z(TZO ٧M@t,v-s;ƁZIRLSekK]Vu+WYiWRz-4,JFsQiH.xC^2ʩ"KI6& D87Zq<^SHAic  CAk/ .>by]i9j z\*.9#XZ|qSxwQuא0TROT b9brII]aJR)',PS1Q 2,p4kdtg1)O}rdv5U"JMk!8Cvst7rqF_iJhWWg2vJhX+bD墶 @8r^%D/%D/-D/&DAZk܎pNG3o$8ev 8Q`I W:r(kDh\9KeHEE _ygLD;GK?Bºm۫b9"~[2H αXuqza-'q63#FH7:z#ոhpnj\{/5h׸ʣTqָFDmCqGz楤pj6Jp:@UEVeNe hl0]ӫ8]{AO'|۹ :=gwaDM￷ʒZ[tXg = r8^5y|i6f WyB czp7e4o3"xkcm5wm~l:-<][l]y?x|j[.Ir) σ+rBy[F7ޭcofIӹBn>Jփs)!I%riJC#@$[t`;+ *ȵJȅ xw:9 G6dS&Т_oDj|l`Ze/ƥ=E] / JvL721WYȆcuܫ9ԭ("NĴYYgqpap㖄JN}uV"\I`*8:44" Yk5S-Z3VW"N_Τ u%>%yR!4l :%4bTrKJ#cFrJ1,,63NB^ yg covi\OS߄6 Fσ勍]]bkyRa;g5"\!"I)h=EN=q`{pmd12lr8 A1TBnGI D&J>(5ÈFðF1͎SQ 6tY{9U`{XhH J}*e$5Ih*`uԬ 
&hYŤW$iūE*5;NCUixx+X]Q7~RHQL>]*TZS4+jY3`D(=0"ALjMhL&ʌdXGhkӶՖuzG= 8=f|߿fsծjw =fr1 1Y [O9W# 4χ  rF#e]lc߉KHc@3CP \HIaGb% 9tņsx/4I6VX)S->dASKѲS8U nrdQo]!-solSq**mT`VS7m<US=S:ߊWr^J-q^s~_Nl(#EM+sf.xެP^Xx ,-1qm$ aR sLx knu {AJHΧϠcH8nAQ70$7wɯw A-RWi"uqf+I S*it_޸K4+KIt> s:Ҥ@`@xŴZn:t4bdten})6^.mO G$):_ F*b.,\!4mH?޾MԜiΐj08\iÊ 8;ێ߼hG],~FO43":@p*HJtOΦȯ\S_!q.b=8 Qh#(jZSOQ RB/X" u 퀡YDZ15*(.1BK^b,hVc9pۇΑVđ6tu;-޸\>=?ҧ)ZRnsw{[ǵ?xoƥF";!V+"J5'& 'g#M @dI:)<.t =|i#8Ђg[Mu!W#q@'Ar \ҐAtS*' ' ۭ}7o~'STha8GE@@ZOa\S5%AhګDi]OUrwen-kXᐍCzQTb|_0BR  iyA58U8j%YJTqR0u.+*xjI5V`l/S= ;z'1l/>8;; r cV(H)HcV$7b609> ?<яP[sf禭s%_w?luۺn40oVK!ILgSNOg8c-?ҏpRn^o4={]#"vk\7u3>?LCLoarlOy8h(A^|3f5}يsYQKy-tyDoBG_}y͋ڞJ@wn}0u0BpV 3uK@S'"Ji5BiC1"?mZ/DcO#{Ef㬿F)_cjC|u?E?@߶2MG3+rm4-6iΩ[7Ӗlhݧ?aܘ)swgBC1ױbψʹswl6st1ƓSyuCjB-7iuxzPV^vmq14B=?]xu<Cs+MQDc]BQ$&-(4ӌ+$z4@:OK"TxW&^7Y٧8?|-c伣]$n|Mh9p]# ܻuVoY&uoI~פu, ?FDAmٮ4no;Bo@jf|Zr}J)>o`RoH\J7# oE\juqU{q~rCiv aaWWr:aTJ[?@\z[Ɩ`eE]5KTΩV':^Q~ŷ!)P1)Ki*Ms5: E=&Mv2T3Ō)H4.%.  #ZHD%VsQ,Ѳ=Jol٘.sp &T6$/Ùʆ1i *tQUpcc%\*R\1qc@Uf[ ˄`Alpq d=N˞)Z]hl1탴iimo䌉Հpp9-!a C<5:VYӡkHYIgy|b|YM?)jkfw-&H(6yB]Q89C#1JgcE'XI-N~KDNL.HdD3A \e J`-&dὦ?B+dNolS;9"ګ'.پ1d}~6Cp}^ Y 4B`2=l&.2`L)rVԥ}mYxyUvѩ&O򵜺b'Ӡ`.FǔĨAu`Ud@# ,rP;kr` + 9*x $2BbtGpf,.J枡`ol˻rVsƱX:miue_}zv raEނwɖ fW_ U{.Q5+|xm~cÜ!ԑYU6L:' H#~0aXrHrmSZrD)H쭜 0%4.Uȣ5%ԽX˨H@h )Kn|L.˨AQX%LD@:FΖw|}pP:7 j&[my{{C7wbg=[5ROy&(1dO,6VqRST1[(MP^KCy"V%˹q$8R08Cz &AI#ƮP޾wh5,}* %{4KR,_l?QmQ;r6.@ Ɉd%)aq95];9>5GiRA)8I0q& 3@jԎ3^pLxHYu)ш0 )E\ML4hu2G2B,BwFΖ(&ex5ETY˘y%h'm:)`*QP|˞PH#ƪp0p LBr@NI49WdTs{H?,SC+8m˧WcIHij Hkb:4 7|ϣVIX^K ЪzEPОLł}4M,v[>PJ}{ gnڦǔUTh4!ҵu>H21SQtBYfsvTS( ΋Mى@i宦7U3lOmow5yȟQwU 㯗Af7˯n hyMݴwޘй%;U Wn}ȵMɚ/]Qk6)d݆r7D.w=vY߮wۼ{]}{˛7?|VLg'ܹ~?SNW7we_HٴI oW7Tgni%?jfi`K#9pQ ^ygiФ|&67pC@1L\S2$]) tnٵx.0K*ڤɵ4؂ ҟӦ:Wo#SUm/SZ]F4w4)tt}TsƢ*ahL ؁4J_LmsEi|QenƜfwWŴF6&W~j.|5=}p c-b.ܗ0Nw+ݮGhմixLor;`k$a;GaD0ìnĴ1 G|*z4<:ѓ嘃tTu>ɮQ;jeۀAヰehc՗_w<,-Im/پM@LņKBT4b!IՒu'ťDϣqӵjQ0;> qvrg$ !y?X?O' sr'~?кLS(Z;IO&O"@3_~жiCkg/|qMS7% +$\F8 & 9ҍKͤH#aV&v&8i RBG֖F4 yA10fč a:%0IgƆp} /@>UJB֚q#!2$ 
*K(@̅tq^ƽO$M.};?۞ag`[[JVǭyG~g}M^NzQ.&ڇN*jZFfr{+V +]wj}o*WRHx?F F<>k댩>W{R `Ғ E=;ΑyAɤ.+F[֙3HAM*i*e4u{1dc㎓4_F:B6*Z\R=uEQUw ڷ&GdW5n5=Cx}_q̽2CGqNi/R넘Y(POǗ}$iN:$j,V#t" OVDyH.h@GPz(cR]gհ9fCcnN kM|KLdxi4Pe8>ȕU fz6>U%Ұ~Pk15リmzɍ6v-}6? G[t>fҒ"+0c2ta~磌&.- bWojZ9FoGR2s;I?D.u)Y-!+[7!1&0ˏh3OҠF$6:fuM"f)A)Q&,j/Irkyo"I~'ȋEmmRK[lþj TMm{s|!+ALkQgQbzySZNj}.QN>*z:Ĥcq:QivuBv'PgBeLuA &4fE`\`i/ 17u_.^_Cd 5CvSJI:(CdȔl=TzPuJtFe qi‚L[Q"6$FD!'&B" ˅WIi{CR"$RCrF-3>16J h%ʥhr&+EJ8.I5͌yUzƮ\(Bs^p-m4ݳ/|oh4<>g^)FXd(wLǐ)]DҤ)0ڀ7A^C{. ^hS(+ۨ3i;flh !eiz)83l k7;vemlgނ}R^% &Bg`1/qёVҀb31x[Z.Df0K]FB&TtDΌ7$-=O%s>fY Qk:omcpZԗLaL[2leψ=#>iQb YF6"RZ*WpJ&ђfdJ#i$QXSDs}04M3R {"y?_&jjO8[|jZgSdW^T y'(Fz-rM<<EBNYAϋք]jwʇa>Ի(ĘS rȩnq=9,n*6jC؄#$#:ՑU~VJ\V);BRb!eaa+!$ EɸPBXιR9d=} 28@I}M2/y5GE$& (3,\& *[IΔ^sǫ-r]߾Qo%y2p2eZ}+ѯ'YLȜ3,4^y2Q6M#9(1  DBQ"Dlg-$=HL1 Mk ZD^Xuo`0ѱbS&M?!S܏uz<)뚥/-v=vB{˶YNѥq5S (U4+(+-"TVЋ#Ofڨ^I}?cIU\8]]/Nj_7uݨ}"k[q}鈩ɖb/17b*:'hR"f=H #t^3.%cq&xwnyڿz/ޅ{ʠ%̑$LU==t'yu&$$`^!\tyFJD̻}\{KlKlIel`>AKf&M#GUW* s$x.b9+n@ȃF"4{RtiVjXdDh%>mb6eX@P2HA)C鍁`s&hx:Ơ{:R)MB$&dgt8pȉN2LrDAHYL'i!UQh ln\ 63J8,® N\:ޑCf?z|Ư~?dVxa>n#2%g-#],0QJRĨ&<`-wHrEmjվ{9w4}e2- 瓫|c#tq뺤||9%߽ot>-57,^?P/慿_\eC-虛x6'|`>N]Ѡ$%нq 7LnwP ^v¿chfËٕmjz]Tqt<3 s/5dvpQTIP!bfz=^S/%فٯTL;zy:2~]%X.T:T`eʜ@|3] 0k %!,F9UJ黿~xg2OBbe*ÝLjdt7^ye_ru-E߲M.燗cq 0{|;RWmv߻&v( o[(9+] Mn!" 
TgU+tUv*(AZu3tU *h% J{:AF+uEu=誠)Z;DWXr*p ]RbJZ!KtU ] BWm_:IRV)`|=jhuUPr ҕƨ.]uj+tEh-~)ҕe`UĠe w \VUAeOWo ^1 j+kOvp׮Ce;ebPlAW^ZAwO8zLr[[[~coGN|L7aUeb~U JN^@wK \Uh*+2sF4CvCmg r;m[톃 Y.ɜ`g]ZT骠 ҕd3tEpu Zm랮Nts;CW.UA+tUPb|t%AI;DWX ]Ztfv*(JA%K&;CW.`W誠JsRud*pCWh`Aً+Å]R{kdW"mŻˀө]- j;ZF:i\&W/z1Wí5񵡫hE/K% ^~Ҳf.-jL2~S虗⦽8ۥAHh鿍Z4nsQVs)k,JX{%!nVr9NiI?9zp4rϾJ)&W(WK23ɕrE>Q +=hC\\)&Wg)WJx"paR\Td.WLvۅpr%HUv&"c4r~bsotRJo6:C b0N$WfG7YJiA)&mrurY?R;ȕi֮Vhra[:GJ޸JN#W)6~WQor5\3 D_ZN-WKp1'eo_x\&W/zhKg.l6roAkqVs?~/mל8^D±v۠g uά6(ن/bgr)N#W+nRڴ*I* ɕ4rDȕҞeQ6:C"IDr9pYJiOsnɻ+2֬ˀF}<-ջ+'+lef+ wWr+1x&0[rKԛ^\)e9UMvO#W{Yў|Ed&Wg(Wђ~"R`g#4ŠҞlei[j?GJ67S1`kYjW\7R;hYRJ'\a*M#WKiRZ˕RMQS 3]󎧑+2\)-:MQćDr%>3RiA=I (} *abqW|2qkW|/"W&+GwJirYUnO8 pYJi_\)%o8r9ʰ{bS27ZўJQmh\ WqxZ\Xr|t,hw0wZ9XڵjڏZ6`{k<+>]cm%xZҢAagshީ(Hg^}ww`yzbDٯwM{znvzѻcoo})'7Yy{pOO~ڋW)o5#{l~z`'>b/^==\QAw̯ <\g@^5x scfʏ Ϡ/fhF~Ozi/A殾%Uv,%(XVomsJI$ޔn[l Oן="9LC@1wɦ%[8#n #L|_ݛ\lbZW٣ T*JfP='!T A`z$:g:RIͶa)!ɖ쌭1' |1Et<dywsEd{cGZGwH}fQM>DPZ kTRJ$ڙ)_"$U*{ҺL@KAX'+5/V_dZV!93trr>yR37g,A uAg!8dGhݢudV@NPN֙/  *CIA) K <=Jvx Ab0O#<_ x;(N2.RiW6c :Ʊe60>I-`1wj ]K 4cПK.+Q5tɣƬ 6J1z/PvPc@n78(ʡ7g@Pzb8&@?AnpuTKPْ 2r)BRQ&t$vP D"I6b=%jTߍv]-pbb oDXe$G}O N+y;Rl#fX57]Ґ? e v0r0DŽ0Ȅ iAC:SmAWP ,,tD:M;&s a RPkJ GeQ&!JA4aJ 2O p-Vz!訽Ŋ2Y8J892rAX/q;jϒ (Ez@?PYu <+)H!xP{D5}TSR!}kcK2uDR,[[ZDFࠄY;II`]r/U+cG=Dɇ (ð^/wtOX+fӊ1C*LUkd- !hBKc'=u&~qYo_Ron6OZ /bc.(یZI|tx0] upХ:`qK?n:@G6]'*fi#fAMX54zY~0ĭ7hRy7}s.^>}{s]\$ty(:tugP3= 7`߅(Hu;5W昬58)kn̘#\fã mj3.5u &% xKddrxn9lz—g 37P3:)j+2\J`[P\MY!zzb#8!YOaV5P?DR6Ԇ%u9Xwd.P j`Q GE?pNˢ#SĨIFL >@`U @g\8!.θ$_r1~J\S$ád;W=>dqFiS]]Sj~a<ӆxhI2܃6'21VnKcU`%1{䃡,w&IS'6$@8j\Owv~Lʖ?MF)AAhSze-=-,B^'XD10e#56فұ^4O=ؔa9h^2 K{=zr: p 5h%S<RrW.>z{J7pikI-ÀK~PKgVf%YVcb@g?}|XK-kdE!`qXUmۭ)1\j17 %pnS3fӲO`Wx d6?p`>Pۓuuy5?\B]E*{Rbv|`?[ܥfϓP 0v=\ voW:Oy@"=ࢬ4;`t Zz qu)Iu6\Nwry@E2#bəj3)Er'AS.$l6tf?•z\ ̹&+-tR%%yDr5Y`9:F4^TtgJnt! S94['(7*'m̷'{t+F uSFga1Ic H=iEMN3wKFs!:H{ 9<_/{#Vk_Y2sis/ ;@{ s(m>[h? 
LIYܹj2qA{?_OW|u=5vu |2nwmwWSf{DmFM;3|>OI*%WY- [`gP4%YGMH!2.KbO?g:_\DT=5צy$'2 4ӹv%a $!K =$ %:!yz<\wa\y$Pu*+k4Xރ“"6ΩJ`ШLБ'2,ffc'}diZtه}OGE}{v#P#^C ]|Q!-v퉡ƛ7i~9_O#X/hQn9G3mZt&A uB)OaM 1K|;횞~^"+>%mWZ=}Qv@jWt(|nU,ެuT)%_W9OybOAi`Uߍg- ٌAO/>`Ͷ0k 5}8q0&y:Lz/ $ٕ[~n}A;Z%yt 1G+?"|4e2UKpQk׼>eW-鶅QHG_Ol:,~aRp25AH߰4Bi&3w6ww:M`nO?фKmJ/L-.2_l-R[*G1,SiSt";NYr 2LrD92[hYL9STz(2C>괸=Qm|iŽ7[ |9Ѫgu[4f\~;+cXĐO ( $O7>ճ/ <6弰iK.@,\iT^O cWB RC| 2wωGsL!3!{͉%oLu`{٩D@Hh0gtK&I!4Q.F1Ybcp.h=rz@:[x7WwyLoV~[Qrtg,t69"XxWȒщ[gSIy$%Lr{{:g |~D?`/];qnt^ޝez+KW(Sx9sWLÿS>+gAFOܲdeʙY%Ji|dsaLY 2hf\HrJ-GuЉ^%=_vCyrS򥷷kr1Ay]LDeJZ¥㉛(R)X#4XE 'o/޾ö5MꛎHE^V;ݛVlaߢ:"\[lM t(lή;|@gE)Z+%=Ts_ gUe]UQ.<Ź!!d;}('$i]ϝ3-Q7@u4s`୍c!p$H114~ޤ2޵~"mE 86:ާxZր8x&+ GXbjXOWC'7Zr_/ϓIǞm}Yt{*eԹ[|6#|*rVT}<1ޭ^sZhoZ[^ҢDXO=ħl*r4'i=IhȕV^z[eVru2$~UNua?M:%%&.CtR+e7PgiF霩G#,6爮a?xIפȾ_ btYn|(hm۞VxL_3oڷ&JFEmu/i^D^`E-VU^ G|gWxaѮ{]_*td&KsI|ᢼ[+#sR"c2t(a1X#S f\J8.pR$YJ6 I 5Ƅ̼?JM#B_A&6u !nx^y%GpmeOAy6U~WSNvBT"%_U9bJL?ὯM%3gfFǔQQ*Q DQ,Hv@ r͎H2O`5ɪL=2:pBFQN8e%إQܳ)؛89(ˇd,le m qbɺ!Jvuu] r"r7xӀYm1h*r|ytr:>?XXd"TXU megUN0FXf.䐴JNiNu`I#LoxȆ(e[%i&dbZ#g,uق,GtCԍqHF4&sh,NHǞ9#i;渧,4)8I0q& 3@RjLi/8H&Uvn".%Qf6KrEf1Z̑LP:H*jT(yz!dl2&hs^e2ĉ-pQ'LE: 0Z?0ׄ;cC?eZ'WCZtY{z6=Mr*kidOeqj#` ۣ7,Ĵtz TĚ* :z'糤Xd3~2Ka^lb14f3s`홦@ڗ=|(}NpSw nޝv);kuQːHׅdc?QQ6 ,oFPzy߭V* ×\{!7m|zV8]ܴjnu(fK!Mσ;4]Wѯ-G(uSrjevloҭ h]"]s ^WT%1o'c/ฬ Xn+VܭPֱSEw7ר.r6b0 osGwS[nz}=qc#Ľ')/bZYZpW]JqMQ娢 =h'a/Y=G"2jBFL嘹qZ[$Uu1D#PkRB(Ɉ1;'TLG)G"TvUg5JI%F~6Mqyy7'ne a&kZtVMQ"Y%9sVq#8H:Xd[&0G crRLߛAF׳rѲ5 *UZppMɈo\F$GĢcG!c;Jr.Rp:ze};%qRLfFhvsfy%%tz+0zfL8]ŀӌx4V^9id [JqhJ'HKsp h+R]Eg4rT26Z.Ze 4d0.`H\I/2c<1ŸX*w/Dʰ.Z|+ҿ%,%Ԩ)yJ5;J ',;xWvNK7?#_`i7 杄z8C25i YJZD8ᆈ@qP{K&NɹVRrNl.d6'xyFY\[oHЬzcɻĬHQ^\Kč4np=geϹN\LqhnZvpZ#Ѿ?.[ZwN->jVFpcQ蓰]/ ؉4J_=4vM-izvʳ ПiqEʏ/ Y s~hx~эwc{92O?]|`hw6$ƞtܟ,IicOZTדv9г>S6*ӽ.rSJVh|V6tR~tqpJee[&|Va;l%uDyTwnFP Aڵe썌z){J(Q6:,0-u`]TN[C,ǔa gh!g;G]z&a=Ҩ&0hA TY",XTtl2Mu%t0rKLRޟS/FKkh &T6$ZT6<ƈIdhUViI#:fJZatT5g #eBq{ւ1>b>EfٹL]ȵ}%[5{ױh,J#'J _ 1Z0& $D98 ױ#ұi^z ]*ib ZmZw{!P5РmT <8&XgLɸ+@0%%?#r=+ȑyH%"MYg e6IT $^9P 
%@x>iSFL1@FEk_+PMj<.wDYY Dʘe.x0†l;i>o.ڏ[:i;:Cw>!)H]L *POF5@Ɋ&$,6V#4" OVS$? $mΤD쏠[f8\۵k1 ~yis$߲e_AvAFc_{aV^ׅ͟$ZG5> YShpsm2Բ<ސ$= ~s^M{7:܌7ם~ . P(w^䄽7<sٺ i&ͨnn;gjBs]u/j >SO.=z%Hx֕.Q \TLj YRO7i `0 zהi5¨3OҠF$6:fuM"f)AX ,j/ ്Mj:R ҺYF \e3 sk YxJ3Ee[5q;B7,7RxԞp͚,?'Q~$4`?=|%Ô,sC=nf}U* ˨'>)1XQw:ѡN{DL^):c>hv̸T9K#ih]:JfO>` Oc|;9Y7J$!2dJS`*%}I J824GaAJcV\df#g姘<$$ZH$ALcJQK튚Χ6و!x!2gc_ʠ# 6!Dgbe0.x!2@,:T ĂI^93FT'\e{XM%tS1Gk͏}-"V[">9qQ"bIY6H2`:4V%r/6٪gdJq Iה2j}0IqM3b$AP;'M5q[ěDړ]-K,.vQvOvQ2M[g,W6Xނ$/rR0SrFd{xx4Yk͎}l~&lv}[<]kN~ :vamAp}+E?]]&ͽIl>85lJ淹P#9Ց5m $XDs[dy@HDyl{؁!~ hcA\D& !dH*!SQ%8/sYQTE,4̐2AZ+A̅UrrF&n"Scdzu5qv z#zDկRt8fsj~=\~cF]=f1#sgK$}{Ȳi ĎAir I pu+t5%T!FЩM5/XI),'@R$):A:! Ev]}8ەxK/\M!oڒsy}#tYG޴~gP>MXr[zJՠP'fG.'f..*0U*5`7VrNGڨ⡒n^ע\Uid6Oסh[_v0v]>‹[ qs錩׿{~u.15KP\bJɘ''wmI_Q.r%raSJ&$e[~3|I)YE#yTwWUJ3.dP[c||&svqzzcjޅ[h&30 zy^uHr5{Vu8j9{uH*)ul}u>{PWM1 bD4azRvm ߶dN'L@)_~Wp0n4d;Xs^ 8J8dBy5Cg.ݫG?4ss/WAZO@3X iiyk2M'i]˥IڠgBm1qcvSgapYV|~}Y4M"]Dr8u%V˚@ϼF+m4UD}}:C*MÛGuk%M|=dea55lfWmfପ.GR]Amfsy'ejw66j{$S[ҫu0F8LV.Р#K%W[YrZwݒ+TZ{^%'QZ쑺"5uU{x*>&P pPW/P]Ia]rE]juuU?KTW Q앺"%{ 싺*ΣB%z 푺*#**/Pkv]*| sr98+.3Օ [࣒͆uK.ɲ3F٪TC˵Un~sBsva~;+,j5ٳD\ &,Bo7qfvU gplL ˬCbƚrұVʬ4kvW6sX/7`p5V>|zGh7n~mޛ7nJ#ێ6(P4۬\74;cJ5óv>8B1p'lf :u0fP`oMT[olKqt;ggcm.ont^?]OiO[rq1i?o{2:G,dܜH3/mpu"u"u9eI5n`O8vܡ3&B=o!K 4Z@e28˓6BL$Ӵ(#d XJk#%ưK}AYyZ@ap}'B%9VSs%#I15g4ȍw^6ֿ>*/X<5,E/;v7E5]bB͐g>րvML#“crR)6y`F%}K2Vj91%c3SvND;k9絜 \®/(@Er%^T¥b2)2 1Z0l9@ $D98  c;$c;^y@*|%J~*Π;܁p+kfSyһv4Ee;pFU0Fִh;/j',Î6; ̍|h+=lr+gFda$<(!ИhQDI9O$"dY #]2p( 2 4Yt 茜]-C 6ִK+;^3ƫR.7Tx]aW>!:]s"x>}aO*B}9[\Eg ꐹ%-VV-N;Ûf) O?u`e09.m`}Ey41( v;ŠAŠlg2jatLl.8hHyDEN&0`ȝZ6;a#'#Р3'#`7ɪ̽`)dFG%sǖkg }Ub}QNj `^?y]20{i[t>`s=:2Ge0>BU5xikäsbހD0R` cwƺCҞFkҒ NIFodȆ)1pBI, ^F@BfHYrcrYFm%*@$gd&2Eҹl03r֌gzWg j0!n7S[vɚ!n+FZȞZٗYJCUr8Hzc&ckmD䁥Oy"V%˹q$8208Cz &AI#Ʈ]ߋ9,lvooeĬ-)[r]q,ɲ4V,z̲'??κHa9D]dDm2Xz;nskpeY~tKZ~\j(ۑ?Hj7ÛAoeGVQ'h0[}g|df4N^yR\⯚5^C:6+I}W}Y!qob[)MOtt>yFgϬK\l'kzpͮZ?p 5Z^xr5?WwWh}^I!w,qWhx≘~8#BWG=yL}/aOܗĵڡq-a-+~dq$_E9fvN%ǩaDWxD-YrTRuFߓ3ٝxhyEd"8=4e1s36㴶<:H3;.LRVgYm 
NL˭5-:JS"Yග-Wq#8H:Xt+g1`e0ɭ̗Yrl*UZpp <'HX$@zNXylc SS@9Y)|]Xa`-8 "gwsQ90ܐ!P2㒉+v|\c7x4&8]zro!s@\ƒwY呦Qt?>6#7и»?{W8(m Wv6?\Lr4BE\D'i{=m^m#GEnR|3eKA0;|u%$WlIe[d9i[dzXY:<97TQ%5ݣ*4 \K6]P7m8y8xcFMS6͞LKk^mzM}tvh0VCœ~uON}M.,t/:_i 55UÚX`9F ?)oռ7uvϺU͵2y#jmlT^:K}j/_߻iͣ0'˩lJO Ƽ t94Dw"QqRsͭeA`T+> u2}L緅UJOow|wxJɏu})v+e}"(y=I) 7U/ڔW5¼4:Yպ<#u}'TGYReV,"*N+ONߝn;;Z|w*^` h g&IpZ.8{H#Q!cIvRY?8w;% % 빋#/.7ɨ?@py)ck=u)ƬgYBH࡚F J **$TF pFbh> 2$#,z,}} >1\v'~=w͆a[+y XX+6ѐL;|\oc³8B .vVԹ<8q/Cb;_c'8Oeaq'~[*7KchS fbCcɵVIc.{eO,]]鋺ǖZF̩SikR=Ie gMOR^^T]O. L(IV 4d"r#S绣|J4r-,ra%^+AE&P$F1qdC;e-&TD;WV,g۫Y]߆!B7$|& ֈ/NlY\ /%[cV]֘4fkTi!6؆ఽ:ִƕ:EBc "AVX2.!28%%?(I;Gd$9:&jTp^1•uȋ" 9\.#*ǒ/,IABvٻĚvd_B ݧ{ F@<@Ĝek%eL5ji 眔u ˧D HG4ςO!Z橏PIAt"8K~E'kY4[{w8Y ە?>tv ?tǣ֜2ʴ#phԠ ȤtBJP*B*kHRe^ eC>V!4l :%4bTrKJ#c1qGrJ1,,62B^ y¥8l-Cw~|wOl7u|ˈG$s,BFXYE4$B$I6MI'.1+32vC&{6 uqaKIr;Mb`P|L+]zg8#.6:Em(ТvklWqc!q"SFRVLj`$iA2ZE{s,2ұaBx$Qyks?֤>GduC3XEaD-"rKHĒF\9͍Q % uN K AHQG]򠀥3.8Q Ȥ( x|"Ĺ/O#j8|کRd]\qQ:6p✦As-48 0tt,ŤW$iŝ Tjt0pV01ĭrkY[2؀rZ+,7zۏcbp$*h*##6q#%o!7)`!e˃ʖ;əBhWh. ¸0Ƹ9 L>*{(ͩOR4+jY3`D(=0"ALjMhL&ʌV9W mmd˺8t9hjVpW\5Gduo^ =f1 1( rW# 4ϛ  rF#eZɋ*1!19 h}(*gRX:rT@ dZ_o1qWkk{f}ʜcL&M߿hHi޿Q?QbWeRׇ񖖻>wU\{w3g~5L Ί6JqB*0@+)ŎCO$F-SSy'[:Hf n6~ypcEyd"f;Kjm1P( iMQxi 9܃"; DJ A*K)&c~[e%e!@*.!돌#H|>JGJʒ*!UM( мx5|jg-*͎z$YD+%NX9i@'bAe*؜e )<$ݚE̿ V!~-Ǡ= ϭd.ya߶yf w[~wt1N~>#STXA4pʁZHDB}cNvܡ.g}'8zN/6ń(hqG ^j) mdب),:J؞ięzR%Sy~n:8̟[H#6bB4ẍsqhrm e`8Da,%-"lX}~ Dgmj\֨.y )Ĥ4`9BZ EӰwJ/hU78]Zbfq jVKea\`$bν")D^sbrƄ@l)}qL1T :" #u);y`gVeӢAgj01tP2J3_@|0#,d/-oT'߻fg wR'e@ vntcN6:瑽{!GwC/̩:فEJ 2IϊF̟*b@tA_evY.4# VXTSw^_*6ae+U$k(R~](^A](Cx23\ O-pP.7??;i48댻'OFְ~(ɁuQƒn? Fgq4>arl*C6@ 17rt'nn`6eb]dypA7<}|L{:,H(lF54Y(L,( gjɪQNtRnupM4+Ï]kuef"0Ab˚"7 X=<=Ioܙ.N'*wuxc~Z.i$8e߼\Mt0ufc(NP]|*V:YYI#! 
om|(iUs5翯{_ Ff/hn4 w^}h{$N_H'_|XP_"{_X{_83;}O>LaaSl31uw-}Oab}w^zGؼx㺢;Gh'OʗӬe$;y_cX}Ypkž^z-ӫ_9ŇԷ~3po(H+%⑪99;N B?%f%R!EUσ|I,vWtUU]5 S9מpzRz]|?@ ՠI1;p(Քm0Lf>kpP,_@ðd9ډ۝R͕)Z+bQeN~/*մ|o;ŧMAb|U?ԥTt\lO p6GTv$ D1óƚHzxE\:X 8y99T MB9C{y#D!rm6ox#m'dKV?YoK5wm[*S{n۴ݶi=Mt]9G:/͆jaĽNQ兩k@sH7y8O6E;i * br.6q5Ք)!Ǖ> ɾ8姗QCX9l>:#hRKybjHBKg5Toѹz0Gz׉3Ч/Ϯ'-:ąor=ݻȠ'x&  TEnp#h͈Dy42Vr\eM@;}ٌ5YPH$ j"`3-@ ҆`%.D&xk Ǯuz` -t"cLMzk,.t&Z4{g- $X9/v v6'/} ܭuw2NHf&yZy)!$7P7im_xUUD[^I=o,xr0yt5{NMۯ&-+ms0)9,LCɪ_=JϪe8s*ՠv^}x0AE$EmD袭 #^[ͭ ^k6]Wnekנhomjl8j;h_T{.wgѴ6xo2lQxJ<gcq3Mfq˓Yb_Tr N1WY\._Jz*Kho>sEv|:@ս?M;^>f ̑q-]|{q+jagËjPMbjG:Q8 RG_X|1(~9?? rg&{TN=F biQ\€PSEқ E1}?N-6j |z ۯ^Yo((4C(QZ>kIx;7#\t)q_0䎙Re4gkFTՁxUrU{/Yr`}Q9AyYQF# 24H')H0:B!2=!%3sc_nwCig_OL{tgWBŮɥeMbLTt'*#F0NpeR%pb+i`W"FƭFxQBKS% !YTv]WO@'uZb3o9~~?~%({ϲ_Fu9*!Jm=/yԴԉpLzx08qOYz) \KHA SJ£[ɔ%@H( Lt{f:׺{'o3!K؄Y#\Jv;;Q}ARIE0Z',H[ : %Pj/ZQ,m~4tÑ_ݜfK-<@Ϧķ TO9ũ"7x]E󧳫 qWoUɭ5q ^(/cE5BIzdg3n<'EXjV,܋_̆t`.d L8e?W/13N8u>dQbiO 3hfH2iRTBW8o ;2L=Bd2ayUCdPnv}è:p;GɐlrSq;ɡOyikװ+ܺC[Hjba)&^57Y;6kV;BnCҕW65%2?>-Zc#ɱvkZ4?elx?jyqsqNl9v2F9yM[n"#3-: Mvۣ޻LۅHHSsG"W[*#miQyկa1W-%̦]&ޠ)`{_ 3%g_IP6 ń#> ʂtrLDrJ,ҁH^(i Lh*"6FB)F (gNϝV;D!Ra.g.L)@Ygb1sqdK\RO+:u۶.6@ݪj֦[5Vvμ]"e<ɔmI2@=InD#suW 1"iO"Q`[tb|ieo~gS$&JF`s^% I ׉s\>t\]ԷY~.|9>o>x >ƀYQ*a.Qϖų] Y6⣳78"^su=Q'.Vtn';TG>fzs~rN}fji)mUuym~ώR-TwbL82TggXS19"Ÿs??nTljTLnl8/~xuJ4_w_ߞ~SB~s헸uT)$ =#u\N`$x#=}5.e=b)q% fOJь "sr\TLqx~O=lN|Jܽwv^myvL ʶ,a/ Pei>t6Rm%eQTx,ETl<^yGyֹs<)SjM"w!}ag9lˣ$GIyk&a#A%pNKAAJ&J>j _ ́f5C;U@[M~v/C<ؾ9c:˟_|6b$;%,FTkVJg-!z=ߊ αH^}:zP#V R8c̙uj! 8" S^h#I7ơ[!PPg"JIB8$Nd()Gg :*%q(3qéV萨<vH^Nqf^@᳁xfpߒ^f- Vg"C> PC( qn-tvw dF-Jk;)1q8NJ6 E 1"<DŽ+jy"A!PY(2DP$KM ڑH\ Ot42Lyq[.V4H: 6v(1?x4/] sR#ElS_0i@LW \HsF EN&pub辛RI%ѥƗG2#j_ UJ$2-42ax}P+Z嘦zQNh[䘑fig:=|i py| &C؛ 5U̴TgpTHXe>yZ9* 1f!3EP9 iuFGe1X + LY@*6pu(X\N[9ݢϤQkL!P?@Q +YQyi @R"FPm#GEO͗⛁n. 
p;Lre6j"Kl'b%K% eI#A")])Xe(WRƄ>4fr )j2,k- E_ _h>C#Q  ԜN!†ZL 5MȟJBbzc(ءT+W5^TOQTZs(ӎDcLglu>Ɇ& y )*A7,ZsD  eC>THl :%X1H*TMR88%[b+X;,+^ff|rHX~:7AeO.CM<uA6w*jD$h!"I)h=E4C^`S6G!;{6,&ȠIvD 2WVG)qGl7%PPvPCn_E{K \. 2$4D@:xg &hYsְ0>})EaD"vU%%ˍ" rp/*(,)irft)># 78ښT,eqy%Uqz4őgΗSLm8 hE?>S#p"eհZJ9o ᣷X)0AI ߧ&A@Hp8[22Qk3n] ~!v ӈ)r'YBrm+0.971c.O:8d)h qe kLHڅhRZÁDpuGhkW3b<0r⬯Ɠ7(`7\IM 0_՝y)1/=&!#sV%Sg 3),CqT9* Y@/}8u^/ ƜcSFd^ ^Q6lUq\t32:76?x1VnKwe'~q^x [Bqm$ aR sLxnA28>r(w# Wנ;Sqjăw΅y&FRC_R!,/vWRH;KT@ i3a]!ﰐ6+Bu09tPmyDU@S4J,9v1,ɡ(wQ@0DQ&j :H $MA;ƁZIRLSekK]`w^xT;uJ4\_*Z\Js ^H|tL/%h| pZSo:<^yZBhNkBt:Q:«.85C- GSqq㏌#H|>NGJʒ*!3}ˁW]ug˙')I< jDN(q)Љ(|`Fk 8fIwˋUSp,jP~nRR6S4936ⓖ"<WɱcGĎ{z=(oφ8Ĩ XkBH࡚Nƃ$%E]XdU*T8#1OID#0.ahr ջcX+ Klċ6,PZYl*?wwS 4bo0ʀzq؛b<=^#جzAnpϞ?G /Yw@_ճDžo? fzP8QbkPKKVuZFW֩TC8Hw֟uoםiT腏 E<%,Tq^RiM\qͶ1]8!RcPAvҙ&A{*2RD-aVSa,'i HuCom4p݁.2~s[2:,m_IwWبFRߐ V2oʼnDK0.:&ꀧ &AF[C P Yo5!Hq@'AqA@H RCRPYup<3rxOa8 E@@Zø2jJ W#0bprVYc]lqd>EJ{v}w0 Y/lm/ U=['o6zf`{'uR@hJ 1VpTis.ԜuΓ4umȉ%,% 0u.+*xjI5V`GN_y϶U0D.ǬPzLSAhɍ*_b@xE>l֟ڹԹnn~P&A߬ 6a}B1] ] ]NbD?5rJNGqZ\*mKEulحq(ى>'aM.>"rlɛ, u?=t~tqk'kqNg%5I8M䊞=rQFooSil䫳ђtȍXBޢ00F&F9@]RM4?Ṗ\dyke͎F41En6ϋsf$C{z{AkⱋSAlCnNoLjްi3O.itq,y+{5q5S^T=C u SWAJ Qxk{OFzٔ. 
~_磷Oͯ?MoO˯ih|f2hp՛|BY6?^(cA9i<^|:ƙ͞^o5v?<{?Nc![ Z1XSyfO~nm|u՟RV/a< e6V޿mjg`_3(w&6ћGv0=,K7@BFvfkjof/ "rXvPax?R??i0K|{s懖B;wG< [Zoa8+e駬7hicC S9<{{V?)Z?ǧm5b~utݨ?C}G-׬,S~{ Db}sI`IQ/9ahWL$/fa7q1_&-}+ A/j Yg{YY=GlĝQ>k a HSpMCK71Yͼ޼kj٢ [h>뻏G]>t<>TwO=lR5(|u4zTD0{\{d.::,`4Wl)j5 vyV} eV9uR]slRe,:uy?!J7 Œ9K˭.#IXTv_xs`xp.$*8ʝ7\ 71qV{tHba(IϮ0 R[sJy`A|q~_{[3}ϔ .KL4+KLg,{gBbϧm!0[<;^HUF}k'7קO)?5oD:K$YʁI,kK{XK9>u7GH [I /$ VC1Eĥ/|VT~Oa?n yиOof:f:q7#iCUx.|lȸe\f񧃇p>?M2 BXĕ$ @Jp",ZL1ԙb4SfL1)F3hb4SUr-/SfL1)F3h'3 fL1 C6 b4Sf )F3hb4[L1)F3hb4SfL1)F3~Sf*L2ZRTjCS=.'۟ۗtmsqU*[nʧ|njv;I] p HZ)lp`k=' qѲ&xYX*%HvSF8wnmߛ|-uzpnpmZkQ]s)<=}5}lѤ=lR5(|uڒz&` 4,C4:,`4W0/EM`gE>??TGYn_XU6d~60EI%`l[ZcI"`#9Ad-<*sNI}=TE(KL:Xp=1c!܃(]")Şx ֖,R1G*rneO+2,"Iw9l1xEbD h`]B~ox[ihK+1sG;XǽnLf QE0|ը6Δ,oh!@08ZU X;}w,'Ҝf'UuWmT6=]phg^ `uU8bJ];MN_`,L|]%=ŏ5{@闐2^N83&ևh@e*jv? 5gjH2wRB}ɷh*k1uC@6:/.'f&Pn1hviຳ>_'-4(?_U?04Ϯ.>Ϯs5~˫ަ=,|x=d nz6!Z-^kB)Z֑M^In4Ā.qIw覫I}7{}돽x!oRxHug']nc vWfU+®ˡut-w3l"=|$nHSDP09L6&S,+bW#n@(|`0Ãd$Jgb9^qDUZrX02Rj "&Қ:= r/|gxYQE}R9r8Q{GAQL3C(VTcLu4xS;YM׫HwD@4:&(fEwZT<'`@vz:9^tjs|19*Awu޾jŲf9o}1_fG\lSxĐb`T R@w~ָ*\^V[%in- Ng.uxQbdFfɿn>P#ƓtƗg$#y՛/~~___;D˛woy]%^" ۏhZ45MۢiIu·iW v.XfJ܅M"D#^x0ݝ &&؛6H hM2⥋Ȋ V* wVz%eD鑳#=}3.e=c qy`DNpQJeaGZa6HVwD[DK}^dM ÷c 8:@vu klwYyeW^Vw߄U9myv!?~Jl=kWr~:CHcF1K)-,x+;LOU0?udd1G:Ic[8E8!kTP-ٓ r"HXCjPhT,U򮗱ݣh.EK{|6vOz">S{'nj q-M6Ź`1X9]-# dRKJ'Mwmˍ$WbmHͩK֍zwA#?XVԭg%.&@69 4. ˩ʬy2Td` i]O}(c99<2o]+w8QZxA/{t7v!ߨgGk橄>SC!~s7׉xr9'9|&f!C؟ayyK>5MP(V[a|XrӰARaoݧ5 T1 zQ:T wZEP^E+ |]nc/o9%o.? 
gg$Q^液@b2Ka3LATIlA=zb,+8ܿt++|Nfk%\s.ZPK@0<(.SU a5uJu6Ո 'GjmR09ٺ+'͵+58qXe0.3B90ʉ ʅ_Ӑnرte\m~zo_P"eTdp2ej4},&͑ Epa].8 $|BhSj hHFdjv̺ZC fءw2c<$XX{0ؖa`ֆY͆z51Qs<1xpijef )רZa1X2FPEGrf,jEiDrẊT"l}k26*2]L8M?lˈj`FT#NkԢ(j/a5HtQQ zU2a4ݗp<Z`DikM.QτŘ578cD%-HW .*2sFSӨrCyɶE=ċI,OVZ .ਞAԔE|Xښh/GSvi(ؖ|h=P؀1S ;7|=Wg^bYUFy dկ׮!B2SX%+UNgQ=qB qT#8dѳh0\TLH-BȘ cmSrdԂsj}RgcAA#k"Ȅ %ma@ՐkmH: B\8MV+1~9 NǃWj#:`{WQ:W+a">1KnKzޔ@#^E`%m#rP:,\xtgRǔ%sԅq> Zyk$p'"VտWQX̴:Z1e}|P~uCKj,zL%5K#\IFZ.s|X̫r >H*p+'9ljOՂ''U#p%U^JG"ce%KÕ[ܛT>Sq iܕ/=_aRTKUCqr6 sSg2N"F)NZuI:9 O3I9Xb\mJCh-Otr*b~E̝1L;[ ]!\.iEh=&z3t%:^0M {Rg7W\zξoEM$y;B CcTS&gdjq}҄s^GUtr8n Nwo?nùo3(k}zFɘNAnCfg s3[wJB_^.zLՕʪPGP~џv;>=Hu?;V%8Ƥ$==5-9 aVF9nIß;JLF[0͝U=WC]Ad htEUND0demd.{gY4WJJ*NWΧ|NQ8ux% p`%Ur;MIC9ēĤ5=[ ]\J+D JX++0>PtpR:7v%Ltut%gEEWXbb |bPJ78`E`"1Hh j#+1%Al.*Tt~c+3 B# nHz/k]]Y&E +,(BVٱtl՛+1^ui]3U^(y2(J+9֦צ<Ӆ?| ҞH% =m4bǺ4>ߌ@n>O7MiҩzI s0jYP܀('n `J 7a 8H- + +kX)thվrNWRL2Jbp\Atf=m>h-?]J>1mIIdl-e$ -^ J;1ҕ\m] +"`NWRJ"5ftE(H5+,T9tEpM1%%EWXsW ]\P:>ҕF%EWX"ZBW֎>N('1 :Nh3%[q:`?@K#5= &V|O<m]ۏp:`0~R%Zc'`t F:LW' dR: jL-SX'WLj[#SkC nJPI vLBWVP( Fqqq7r3ނ[_W'sq'rqׯXi)TZּ-UNѳlY0n_G';w>LsY3=_Ϯ'Q\U3KD}$'~zgkm\> FUɰre#M^_~8fK4u%N)D whV?ֿjf{3{?;GͫݬZZ:~uӬv ̺`N#B%gHEo'# ?w'KÞfBv~m+kfoV^brߑ)y:KWg_xi 6IA"yv~-jtOS&Ճ1lT1ňGDkݳJ''xј{2>XT1UKn_ pt#~.YjwNw6vu|Xхlbx|%-[?:[mϚ5w1VWZg˞I;^gź|іk'*2ft%s-sJ%7eA-V)ε7h__qqd]Y WuIB)]wnw%NmU%&dhFz%tۇ2oݽӶ]̑176VmaG61A m60 kM0Zv9p5ъ˫|H('oO~_~>68c(6..ڧFm55o. 
8Kwy)۔s [q˧|!s}N727Q}4ahs26^X-N/"ye5c@c?"pS Œ58#HΊJiT9 .Duzt(ە':p]L, Gp%+JC9;J2 VR(ʡ+k`5[ ቮ^,V$r T#nUe|3hŧp[Tbpए>ʠ`y(Ȥjvp PqC?з{5I~hŁdN?cz j5=B Q]!``"J+B`tE(DWHWj.TAt[]Zc+Ba#+i B l1tEpABWv;t4v#+p+]9++,}PDWBWʩ0]!`բ"vt(&:B*n +lWbЎ?N(2TI0g"–BW֍>BBL;HWQ_rZS ]Q<.NWu!zJw4>a̺vL~hՁ:CiFFW]鉮6vUgdKx匹SN>QsGMKz CJi|d&}ƙ4zJ;wo5FzP\fPB;YݺbV8ܩ.Fn#?,l[?|87ʂvr]{؅[IIȽE "µmFE˝ٛY?dgfٛ >Fq"[$;37"[e I)V񰪚UL,Z}ɢ*CЄ  + W'CWV,ΓLWCWJAyBte?{K0RBW>(M5\  PЕKe*te Z<HWѧGۖdH&n Q*T 3F+ TʢU;]Y:43;oqt` Zah35 C#]RzitSetIT'#Jl)]M~b+:c9MJy8 a$Ӥ =&1!V45ۀh6@$DW0Еd*th߈,JJ3]!]1 6$CW.SЕE+iteQv]'+.)aЕu*te*;]YdCR ?EW.M,Z}ʢLWOBWR ѕlyB/%'Lʢ&v(U]#])O".O&wesWRhJKCLJtd}ah,Je2]!]E N0hr ]Y: ˡ+zmO,=/#fLQI7$70t:΄ FehϽN.)d١ r_r糄8̘=.>06M ΑCU x]Y%CW=,Ah,J&RDWX={< .O,Z)c+RLW{HWL3 !BR]Y}>KZb+DH}+%]Y&CWb+`pJ|+ Xd}ЃteQRjJDJOmw]YΛ?lOWel$+!?]Y\BWɠEiF}+a ѕ L,\Lݢ&vr`]27HI?X?0晃 |04d@W&ՃUHcd@A:~DEX"7PE;xV8 $䢱h7H”Io@\B2~߀h oCUJa6$CWV,J2]#]qB)KNǻBd 2=]!Jj/J ުrj7_~k;3N r-FA^x%=2R M%Su._@~Jri499Q]O8>SDzy5* 7Y^7_,6K?[@d<ʃj2 v7yX7hV ojNOo]vw}SMnTs_[ckсY!ONN%'7S]]Y UOΚ(l2WͬGWHst^o}#`\Gr8p;f?Ⲝ}<&e3E5,(i4* ŦyXCߛQoZ@>?l!:CTjw&}IS7}x{=w$Sr:QL{Tй&eU%%Okךzĕ;Gݢ% yy@8QBC .'IiY`ج;R;D{y"Ս .x{7~K(w}8\2uwya"PGǝ` >zt5F#\Z)[ruc88~x>?^]\ Q{Hͷ{kƄ(gn9[2 9(aۘ.0)Z%ݼǀ&%8%B)k*$|棸z8H En yԧgM"~O 2r:t֔l:7Q-lGSE.m ngq .?ChIi*I1%/Ǥp$hf1YI݌-o3 Sja_/OO7Z+n3%/dPTPK[w5Q 6q\sKʔ-LS:vLmt6{BE4 t _Kq[v2 %VՎ$"*׳a\sK4[ŷm MY^Ys|u1ȧQUs*w/CwhܕΤÜU|4M)BwȆ5otrZ_?b1:h}=7BjYsϣ9eDS&8zʫcjH;PKEhQQvw+41OJ-xB{2RE7fXo.y1VnJ֍oq8h$.-ix1m Nxd*<ADz/y9f@!ļVn6q\c6VohuS__ |T^ԅf*~oܬxF 7 hݣ͘L8ܒM0T-siHQ5 5|)QuZH o+{WvD:굽i W]y[޻'|^fog?~:zAڃͳh2/ם߭08| uZ,G z$āBSE6`arNr["׻e\ͽv{&iQj0(Zӂ32, odW3+ѕ|o4+Ӑ16(8.)MA1e3(.[i{ `}2xe\xS7YA@zESS AݷkSkX~f[nOlt|8GC]]{5|e6j߽<8Xͻ'mkf1'7~򶜕PteSٽ޽VBlX+FjХf N?h<]<†V vXiD.ZTˍLeǝ8'Rɐ R \mrO3wTL^iVՎY> -KR!&$$" )1+D=8.`=biSa銚7,DfYggXlBYIL-F2O;uOgC2$0΢vƹ'8a @&%WM穜Ld"M'2f>kA >mE^j/'(f ө8ArQwWzdI R I !&KI$Cݺ %ʝ]u>o6o搕uE¨R7p)(kBS⭇D0tF0HSRmvۥ?6}F}7c?G#WfRN?ZeMcKŸizL]֙mW6{=mF' #_z*NxYUb@ISQZJRɎyIJ)h)O3tH)2N{vaQaHJ$W7\Ҟdbe2I1 R4:+980HJd&.y=xکis¹,n A]byu7X{}iktb&Tn)m+Wm 
jӵ%@yAg=-DL޺5o lsr8-gE?[;rHbUw<<19UlfrըwVanfHzЃ_ޫEk|Qr[8O u4kZ~jgu=RW*es׹ N" Kq/nECBP%%.XL=Čx8<_݊Beی7d3H Zzwָ4Lb=P.4 4#&m·qRɲPO[>gJ|$"0 LI ֨gI7AJ|?&HRTE2 \}\;/a$`&?; %dܡ0iTK[GܓL& $I3AZ6&w)LƝ ,ÑcNw"d>4S'[;nUzd[)A\ a(I&i&RғL`kAꜚlBj uCdr"l lOJ蔜\ŷړB0NudbR²f *95KNM{zE.g!b=H*>+x.);IȧakTOLǜ1MŹ dYJfLf5<++gRQ'2Y=+ˠ-,5H0:e"H.Ae^gP+yK2Ҳ9gdó <-,>ճ@|6N>gaB:>k`&ճ⦕KkD|V?pgrVj,k%/r RBRh| T]ﲜLC7"}<^C|{,ID4PD6嵦n(CJi=nxmҼ#Z.!o]'Y8vߴo^ݻo^wgT3Y5,K癮ZI \9.)"@SJJUS%p}d)At6{3ǰHP{bb M~?߾W>_x~g8_ߖzj}``0UСQlXsfYdЍPDTBP aZRUV*/wt[.|ĺJmyM%xVQtL%&WݙFY$:LiɵDA2jL2WOLRsN8Y1 m ?(n%0e3M} 1^ =,H2h*r'iY\LYG nh|>gu+d9J=+/mparInb^gb+RyBxɸT .ɜ(d9h4B=)&)'O==mn3a3US7²-2ҷK3&2AUѽ5A*]g͗ x5~zf`+R>MaO{y$NoG;ZWX}e0tk%dA i%禦VU oo*Z(2te1l_?wtP㾝i(t+C ?3z!{n ֟|sC[[Qf/T15h)us5#rOn!8c6FdP'o ȏS?/۠Fabݙ,DRJ|;s|ˇbi[qȉX"~U*6Z\p; M!)7sU{CuǞVPyt7 rS譀F|qzV7 Yt@΁dB{a-47YZKhe.ޛ¾U$tW-_u}8ꪯQjwz3:nqYmsO+ų MnƘ^\BD\ufB {of\QS6*P!r!boo8\b ̰E<#ʢ_+xCa*S7cq)TTxc[(sk"M?ؼKK<ҫp?6E';{@AʥrCZ['H uE- 疌= ;?kƤBh C morČ8϶ A^?8_CXbc39-F)\I"󰘽+MaB@EQJ[9L)![j g:9g UJ$v$}xl5ɘ($"pBB= !Q0mG;^8!#!rq ]HN42>H JDyDwYrUיtbTV8_ b(6aQ[G=p?-6Mr׃8j3™ IkA(IzvꞔyuU,3c'WHHiqU*掠_gԫ?(%Yoߞ}.ZEMD"H#lJ$p>漷Bj%D P mC '~ q&/KëlzIQ3fKj;1*/'ƙE_nƤ,$4P+*s1 -c#FҖ$}=*˕ה%r}[:ln}k(ȬSL2 5cl{Ʋ"O<餷 Ğ&(?H3,[7B`CBlЂ{?_m9DE@>/?.eX$L')z=8@r@ en-%XG^~DF{M*2䨌RXvmyJ"H J*y=k{͝Y-7.IJ& +[FT'~r [,!Bܔw'ڲgnxis&O&V3qxU'-/g݊10! :^P 5 bԡ16/N[,VIL쥸1?mr!!I/qՌ)DUx7n=T'xcN\"!9nkٺyT7;ߧގMMIy[͜jzX?ڼ^XJRT(נp?&=#vR5e͓vJqn,y!?t rbJL# 6\)P1Ȉ`@gxl@f^=d.z)Jg{8տl`'C/ U-/O2vY#x%}$eZͿS_b:uq:LI_-1 y^FHi,fAPAc_- Uk6S \o ABbA0H 0 KnSW@GvfyK,E/_@@sONnPc_ȼ;"h]Yz <1SgؖVu-"#1; 1Jl{<orvav |oq׆':x!I!\zvlZr< SXb$T3mJS䄥:xk0L79;~Bź=vkrDn"a9QqN MZvF9J]=B$WZ? I Pd,sO_+Acg3Xp/4d6 t뤌ʔ[z,wED)۱6j]X)R>98~:ѣ *V qb*e6>˙yǐ^@P]KYgxs!\jE.-oΨpMn5cf\(5ӊ\}&{&]z~C!K,o-yh36=? 
+^oK!%?uي=ޑ3v[bWWl']`!qmHK/0 /la ZWjxadYo2<[lq:eF_5O ߩaupFvс0ȱ 'lyǟT֭ތ[>g4+(dlbky%$ %l6rƅ<0(TmaT۽%NZ?Kp.aޭ%DDǎ>,5SGy1i{nImUC؃ ^QBD>iړ؝  (k&`H%Wg7]CJEqHRƬ3qџ**Ѱ-h)#l6v?\iEH?[ nI||C%4 z$mƥT<s ZaX |-DHI3q@EyĘm8d2uMA)\=ž *z:,̞s|,Kazj%b5xWRP6]xF㲼R͊x(2e80sO6E i暫hdؐ,n`$CY^!lJi_ ͔r8h)2%zG9wʷQ;!)Z5Cu<[4 I?ejS2qV|.T \;pkm:m+˘x " yPz7gQb!*NI'ῂXQEVB{RVX?1)J) @ErhF3t)${ s2$TE/XT${38jВZu̷Ǿ Di.%8dM[ (7S7φ`2 сj(1k6_#!^ve@I$d_[#/@I~Y!_DG哤 đ7z#: ("I1"FXBw=,&u^aRZ'd-p*35 We1Q^k xvUPzwޞڬsS '[:ЬRX~ZO;ywdlwK]@h+k8WW GŵBGhaba"y`r8y%E$Xt8YͰY!v " {Y:o {nI'q|/f=3o܇Hk (.hy,YgL G`y}Y*\G唢0 C~9H_zIfɍ Mig;Yٔ @Xd ՠfOU Klxu$qRrR|NfjƸԯBaa}9<_VF^sn: NJ,'6_v[7k9|nę"eWri˴MR(;$$N69XR07yy ;{J( S=n+"̒\dҍ$=A30iڲYU^Z$Hzlgʊ{Vۛ5 dYX4h FP %5k* 85{h0 hRu-i, ~+0[L5I1Rkv[KdDNK Çs@)Hy1r51 8cŽ%,5%9gWZs@"6xU" e2fIH+LvDYB58妬d7XWT{ |msf-d+ l#߭XB #US ŖQIXnp+.…k!N 3Ś$>vH#ټ{2~#l/-0v>FX[҂ѲM(> a#oV!?-:ˏijU_y>U0 (ef'4qW3^rv1yqڠԈeѹؘR .'m:b,q"ҝdZux$oUBRgDFܡgf\p}EJx.wː@8|ףIhi0 z ٯTg7L[zsSe(eǛt) .ȼT!"D-ֱU'H[u x: ~>ܺV_6ˡ,Rr׿=^j*pX"q}KA8OsdgCjͦL5{\cڢ'iFŚysxЦQmytqeZ2я';^Axm3W7]JI1qWStԵΉ(i1)V_ /(bO9y-ia5{\5Ev[#>Eٔ'.`@&K"p$ByU=v_8J5V։MonV5 [r(VBmSy3@=0gvCTF9\C{%keONxL:VD\Pt>Ho9Tq nBsyܪ|أQ`į)`F%%k?y~s2r\xKτ@kPd4߬'BDY?'&UI|(|kIW7ʹ]ů |p!~>4OzǛ^6~F}!g:K2Ͽ7}3%)L#;v7!@e~lo}AOSs]S[ i!ȓy3F6 bv, 5~`j+Q/LZsF{oIa?Yo'InJ[Nveis#/>k覨`3qw:j*+G9b45?| 53#cywQY8E[#Q^;j[|1"AJ]kĹ6uUbzz?妖5vw}N[gyjXɝf w'w^yR 2b(T 15BS卮_A%%Yљ˗{5&1um a0r.yR{#1>k{cTXINQ\!|BRFX0K s>'JVwFHroW&.b#>ptQޖx9)zq^Mg2q.$YaQ,rĭ])"&i+'TЇP`}q{tn6pf"d0~r`U'%a?ivbJudvv9MY>~%JJB_3_g-˶GʰuRtqVd[?yl)CnaVQZ5Rs@OduzJ:;9eweK0 hw8!gǟ@ȡ.@fnb+Ga!}? 9m]<~ AMY*Q!oEh'l8}fKt2pl+5:uxv3 c#aLk0vzY?(ZGdRPU&&+d:C\(0Pids޸LN,zmR-g,^ ngh63,rjU@c KT)JkeYe1ػS~] 6ePlL>d.%9 dmXa>c Q Ife@V#A_SgMy6{3-\FVb^R Q" "mi PR@b pl,^x K&;Tb3Ȋs2BMC!m!I<dĩAaՊFNeW—?6wlH,i[BB}tB2JzuIȔWC F: %5ȥsWIzK{љL&-$ +*B":-]TkS@0-#Z"|YAyoŐΛ^j4r_,rDԔ KKj* ŝMSC! S= oi0Lh8}[ެPXH cf;rZcG6 (^[m-Tq*[ww?/J襊60{mVj#Hܣk]L.G, ,IS n$B m]֭ej1(6—yHH C\+NgLF rä=|)bIG;vM8p#+ ځbh`bQ E+|5cƊ܉h,SExݩʦF!'A`,g(vIh5lR8WM0@ G%3O8,UJ^3Kf44C. 
Sln`_H-?\ÿʹtRGC/^+ dtsdA^|G^R1V:E͋2)M nJep3_e Ek&%.&{, $UdAà\dm$]PlOz%&8߀MP~6h'k,HN]ldh )p Sù(^Q)rh 0hl0!ߣpZI]'*a!\J" e!i QXV:Q '(nEF/f䍁L;8\C{kDl\ЎzA13F8,{42:a[`>%*$KV*,IXfyڒ٘<7H]$!&ZN^2`ҵP*SB{,c-T P*B*\RpcT)HsGJBtm&%\E507V@,}C,$3N;isA ^q-\":ՎOڣyB3L/ˈl+TD28 ʵf#AC+yB-K'k @v FsNfe,"\}x] W%\pUU & :Hl%2(q5s"GM@`"}u"6NhK4jFSh4eFSh4S4Єs'$7ڹakqjVRvE R xbU  xѩ* #IjR3ČT|lM~5=_Vskl)?^E{hVu9}p ZICM\KiDv~Ugq]mX).ZHٌdK<n)=T%뙅{w n\j/Z(xp/vDS";VR?GV[V󞻑W vakg{38Rr~8R-tߎ^ k(c7ש΁U>z!x;|{,t+iw>,e=k(TL ڻhT{GA*\(U~H&+y^eF[>`E!Ma?q6hKL׼ߖ{,zfԾȋ҃+;^(eZ`5ڲ8W"FGP; w*{ݹMHVlx eҸ 2P%'% TΖ4^ZF(rCK*.IW1(ry|ܖLU ؛T4Y`ӏ`Xɘp?$]%K@d*xHD6bE3W޷sP&b:6|0@ڊ c NɮSxh C]d)uJv)ee.^9)[cfQBXDb^;7R;nq5Å*5[EÀO wsG#N## i̥>8&l$Azɷ%ʀ*'Ue@UTPe@c@e%FupJ i4@8M`M; XvJŗO^_J0v[ǂvnrfݺWnUc]% 8_t:{4k@63gJ y/?o# ]Խ>E%D,q~R}.`vFL J ^N/ \ѯq"mzˏ U)cB~dq O)隒)隒)ͥJlY|Û,#I:X BN Stx̦>Q ܃5U$jw??7źE%.M;:;*4{FȢayb4"jGQZ'DM75 VA܎$m+'ncܶrm+vsv DC494g'JnF+=4dlJhJ8d(!{wۖ3/ۮ[E*0U`bH^jŪ5`whQB ,'!8M2M![J̄(Fhya">m41$Sim{EkJk+vJ6VC44`e H͕@ !$=pb  R[E ` ((s۪V LO.mImDRf tYeז좃@(iڐD$A4RopQdS)m޵u$ٿBLWwW#q&'@觭,e;)Kԥ{i2H/t9O3=$'II~$Ǭ `SA)9_ߘAQ \JLΤ :Jv ohe_DQ;;j yBǿ?j]ZhwZ_^_vcb]S[M2 sUh1@&ʉNdru#!HV$+%4sC)s7Eȗ?-BL'OcOeY:;zV=Byã{'|7.|u\Nۣ@x8Q_/TK@2{=%h{rS<59rdK_coЯNJ&:zU"j TkD[]_W-)͗|7ߌeI̹V.&+O|ϼ|m|_ ̌9[6݋yxFsR+&ȳ? ^BOcYX:O;ڲ= ݿÅU;ྪNjvヽf՛?$>7ɢn'o^~|L[%/Ǫd3X9m.̘Xoj)?bk]W/ &pEрYG~垘 Ћrcv!otqYdY;jE֎ZE5Ώ:a*jՔS ی eƆXcEm(O͚'vBö7C4_^̌0%4D^$k?s6V=Oy뇯6::CxV_q \+vGO9~ [.)Və|ca[l͝%u` Y/=O|* Z N6$wo_ nrԊ+Λ:CJG6dV$pUth B&SQ!uDUX+ ` g6{LtH5g3ctWAg]>yV56#* kJ}*$Da27LL@e#K']JbS+Z<"Z|R!Ќuto'&x|HZ~`+TSH#5 0JT-qE12$h:d#^(5OP4b]LAm ;C4wO6 <+aeJ״՗;93Aicq|&)EK#HOL@ ՒJM 'l'ZU+vn~79i^.6“s'EsD.2ΰe,XH4M]](6S*Ԙ"i%+'#LI/e| ^QfxmL6Db>MC+ݫlі*ϻly7TQG˧ܗWx~q>v|^hﵝhn_?uW@BC!ں[:vuIu|z|c|sjsYxauh5ܢj{xj`1YhD 6/Q[ 1=;;?~{vw#lg`JUMf""AWD3F, 2 0'+њ!Uou죐.*%W.]cY3t{H~S=*jۉ&6y7\˰ n?PT$RQāK"bR!J%%Bc@15"lo{xYD~f`pt*w=! ! oKu 崴"x9 3Bn]g,n+5jY$Y@r$I5Ȼ ?wX`4XXjf" cM)M2ʵ ~zF\k? 
50"Y;6L3Oزp3%LM.ŝ6̾&& Il%f@G(Tr&#jӨ 1"8÷Y Bw3vҮ=DGb* B,Q 9vE|1l?U>쓪&1'-dH‹P[qg2av*1qAh0oC J[1doFa3~Rаoe  \Aql1fM21H۶铇{_H=:c,;بa3\/vӞw PT={>]e܂ptքF A)RPe髉U$Jg.GO]Јu ʦ8{Q%=1 NòKRfaX{Yf4!i9T|cd=4*!R`2VY5R`l r`FAqآK^>PaT1.׿_.p e(ڵ ^]4Rx3^5.ĒT4I%wHb59P P\J2Ӵa>w&w4M8Cc[X[s-ig 186ccJyo8?&8Ǭ#)Gw״nΐvb/NBޭ {ӗ+hivbYG O*MBrAk(wR0$`)-Q4+ @z9{_ZE4:Vؠdv[2geboh ci͔O\df oj̫O|Ԙ&BL{3>;Ʀ]r]5G\;^f&ȡFAfbn+)&9 y(ncZkcw3sZxhֻi9 }ÀmcpP 6>2<0Hqa`9KI= Z`XBڮd3 E^BypP0Krm|j@*tںliz7 1ɥwڄt fH2, ɴa\wlwJKʷcCs$;fAY(fMC45bshKGƣ{a)-`i5=qi,5+q_KX9.I. g z7P |="<#<%[,԰<K6$uS\fOcYs,ȩu)Toj ,;_C2{(Z#7C#mjTWR XIdhT.;( [Ed .bvUCZ> |״Mg8}6*LPj%k\D ٵCƑ T^9g}38m.*^_v{X-v2wq~0}t WtʔЁ.;XV)l;Yr"QNqzmrm-7}HNYb5FDuL,&۠jrwUm)ChXuYݹUu6DJS;6YvX.OtF&( vҫʓ62\< χjmz+08A3Ӎ"EhnLαRn37'dgbN: Iu")f)lv3SˡSPHj\.L i5)ΞnC~43<-a%v֯m9 @D0α d%XS Jsz$ey Ն#J=Ň怰`FfaAeۗTN]]#[p{ȕ({|GE7nmEB^#׬k'NtҏG;B%¬L#6B:+ITm!$k|s0̰6d|ۇ` )x؍ Xo`ԊC'D.Jz @JqK^ ,;(lL'aXjLG' l{@gS 3͐!v:q~O[@cЩxA$wFsޡF!i۪v9S:`g=Tvq%u:__m⥟R9|0//K_Vo?"EQvZ}XdǎU]f$Z8yݢk# v7<ܽ !>C__?_os7g3lсJ j_uU/=-Ƿ.tڣ u2] qkymy+O Fe.<\^&謿32ў햸TvƑNQ]ΏOzP"xf9*bp]YT^\fvv~IOJ Js02;fJQ۞:Jk54h9M9'BPښA=#y$サh: ,lDkjr~:/ӪMh2ŗeyZ9yUV|ZV)CdhޘW9¡1N5i $)E.7.(탶*('#3!z9" ǧrϟ#wL nޞlSKYi娸7[޻s^3}?{ƑAԯ 8\n;zE_5II#rHpeUuuUu=Ddr1W63e&'Xnxuu2)>?5CU}翾$܏gx)It`Oӆ{{@l[:ּL+QʓTPsSiYuIɴߞ>yN_lr| lDO7og:z,ɰKM]X|'uvgey\w,O\mݵ 5h7)8rU5s* E-=2"2]M͇\R@ߡNC{x%i3k$߯fCo{!>·_-੄>-fc,e]/Pq;o2-iϹpkl,^8w(=V&puΠSGZ9ѵ!!("H(lf-~M*xq_+7igyisw-TZJ*q5·zĩυי9H.wW6Jo1r!38Hu T/L9;vPwLq9p*9 ʫ8T)G[IEcfۨ0r1t^U R-2Ή#@5ok Ԇ3YK8 n !a [(SsE3c/>7>s)f39߂vZ\M8w[fӡQVMIbT N*X+OhZ4R"yWwWn}Un?E3ԡ j^*.LQi4+ ZPk-DSpZi#LYC!F6&LmHy5Uc^2isv{eÌZt[bo8Pʤxd 2I'}ĨH "([S$ Ӡ@M AׇTp$(3(Po$E Q1-]m jHÃU7t}ʧy!U Kn'Q9!d RZVmͿrfLWeAQfi虌XGsLŷT֜+0Smr­ h\_ ¡!ֶrGhG7iSh-`c5P3b\B4{EQ' 'ny^Jc"/v x[3^/ !_Kjo([)i`3Mmh%2.#:(U.)ź(RfrfhVҮ2 oCv2Z>йx.2.n7"'M~j4LC!%v *?̢Q!ot< !ڳJ&!ښĴ&l/]N.ZUN2Ͱg]Oe)!)YkTxfBQ,oYa#A&3CRڼBKђy%aVT+2lVoyU[?R`lE%-ծn3*#iIW׉+6Xq{-1GsC?Y+N͞a>m507N~/XE" Zh~׊wtTvrQ^R I=$D꟤O}6ov7NBS4 ȴ5c*)lOUx&)ZPɳ}Q}_ 4S lV434}Vgн^K2-iFKz|; Ȍ˸S!dn#pKz95u.2c`$+ԺyFݏ?8'S" 
c1FU'Z15NY}{9hZxnzR=k_iDEX.0wsưpڐ#.[Sdn=,άxǁ3 xs|pGB" ߗ+cR y%FB9m\xԍuxDSqwǗW$F 2i0S(æ0m*K#ҁ7*5-I3_qN7>"KoOt8qb#\Ds^Qn.jmYYk3{\kŋuVᰂ -X'mՁ+joVǤyPzO#G o[?&!u<4biF9#8tHLBݧ=Eާ{"tf ^ƃ+x1(.RNki$'"Pkc%'BEBSW ^ߗo-ʈMTis(4V WkVhۇTC &ui=HK)Wzܼ*A~T^ .J:u#tY4õE} S Ae]zW_p&BmT02׊wb+)Epor1fOM0ĊZ&H{U܋%7|Ƿzv屉~Wy,YbJwv-r,x!xbll$R`X0!1A p%mw˓ {3biWEVLJ%o7!/Wǻɽ|p|zD2'O_R&ҌLjF&&)|@,C?0}H Уg+M#ɻ_d/.`Afd~y+:,GS1F:AXG!?Ϸ6hnR~jOQyPDGB(59L)N(/̇o|+Rh/nRt'%b7V1jvv3KΖ\7gϟ.g?1o_㙳|==/SI/..z-69J\į2}I`%w/؇rF|97C.Dd%g!0υvPYy)j)0Bb0u<*/?\zJ߯7SNk!DS8 0~_.=. i pcL ;y?bjX*.)`dL*5 0Z*StF꺊N 0[$USrJw!%d/v#Uuml7z# f;Q$ᥚ|d7=RU5u3EG蹕@q^ Yyci%5bs :OVm!xzg:,dߏ̘Uܲs;V0vv#!{s7v_39~Y Wg{^Yœ7=2̳{קX3<Ȋw*{X%*a8 `*A4fq<p$D*:_ <8ބV{Uq$."Zٍ(vlAޕN*٧ #L}Ist~ q\4]q%%{Qe|4ZMQ9J4KBIer*ݘKgGE;/<&5;,<p rOjf7\kquCI̖ΗhK-AEh!$爲6Za V -ADxDys]6`9kvAfG?n@ @vkVT=oV˟}3=6{{oqod&?< O4BY(.z97R&rEƻ`Ai>F cZDP:N?{2-X*ծX*ծcs;zJps %R 3(UȤ .й׍ՃS[ 1P篪Oq4P=8/R~wSU}p"9,Ϲ|R>d ~ Ddq:4PD+)!sR.o}thVkDxKۇ|bfjoczmi4ɱ4p+4!dx=<="N F6hUˏr=cRƳ[!O-df'OMwF!G_ Ȑ9œ*)ܖ9X~A;|O> ;w;瞧8,\;x6H;)<ԛrf#JTFs/Vpr*V.X)Us Z.OB^-Y| Vp@'Gw4_ 2`;YkdH'+#>dh.)墠t UKjȟU`3bW'+e 2}*]nq:JQQKɚ2k A L@1j]AԵ*wR,\Br" i"@)zyAyLv$OECWf w/|ɖ+>^iiHVF<??|*=ø=Wʰ]ށb% ÀMq$I%[Fs5IݞR X5~\ Yj8AZAS/ޕF#"w̛싽ÀִTꮪVOb:Ju0+I&%1TdZ;zWKnlHE1߷04D)ZZF+mEIK*(Ͽz=鹶.P;JS|pa^-u ڳ@%dfQﮖg|>P,=:?:^-Ύ.x3ۘ0Bg∀ \7 ք(TR8P9)TS8W@(+1sab|pTwQv\nWK`@@E6} ]#|xhz}PԍvAhd:{,e﫣i#H~t"t9W63T:lalc{Xn8T_1Q"/`\;e=P>Ϗ'a-ŒKоZg/:VK\ª;VnU[Vk-_زg2lC.su=Ն1Yԍ[mu QK+ :r8+lt % F!zˊH( \CǎX,,3:JS*)׌ ӱJyɄ"z}P46aiyɊJ·i<.WXAB\=lCBK+|() s{QF P&jyu=Vx{o|(n5Xa/^/A:g_*p"AТ0RU+}mfN<]q&y̢s;ق5 >t]8kutK5z֟. 55d{xRm= \>HutBK1<.@I,ܢiu.WI'817lT~fJ=d椳1W :%뾼]45 Rڻӊ&yYkʞ5f c˚uɷgsVq >}u HUCNX6f| }c,VujE 彦*AFA|m)n)ՕܗWq4"6:1ŘDnCՌz}'uǾnۏt FHu9p ,XwcD!_"}~7QR:*Tvoqh*g!e9 gQE.!r2' M`wNqCy2Lp3 ;a>9[ Ot/]g9$;8q:[fO/"qp:ۜg:>r+z:?4P|U*AKO}Q4G8r_rhҔg-=&0aH$5D ^ڂ]d*. 
AW"EKH8 ԙn0fշKNl/A=qP:IUvYt??{FNYc6HF#wʃDט &ZFlNӵmT+$u# mAI&hhPBIGb3K!C̡of]_@9;*9`u69~D* pW]-~vrg5JJ9 c4VȞQ J>';uG.8)g1}a >{ZB XŬJƴTUUa9E@Ӕ-/hTQI 5|^wmr]0,8f9$3F=:w_ƪ?P,v};F^h'ԤƂYY ,E^+F?ϯe}}K.쿈TEق5VE[4M%lTh"Uʔ--Gk5Uzv:BQjBcf\[3k#LqCj_!r]~:Z("Qb siäp3U7*l!\X㠚 Յ)Vn{( =kdJqe9I+#棗Hi O%-5oIĦ[mỳЩendl*RjoWgcFd;L9F 8t&~ 1n$|z,sTRΒ>K%/9\$umi;]ofu׭a+ eH&m@ն-Btx]CK #U;V=~ \-yHr,:!@tcm 5SFJ:m]߉\Iz&=%Vw/k{ {r>Dʊv 姆aόНxsqG D%zW. AG-[գģw?_G&!fu@Z+Q5ruXG"ZzP$\ȕj rlW'a({N ;x梃t)vwBvfbV P!rν* ^^E^YN $dE`m6VZ mS3)k<_9\L2h*wc֎(b1pwh*=Bj^4.Qk k´m:+)}Sݏxn{Íjw3#b$hѯH.#:Yө)ZWE " `Fd}ן @/\Cϩw:]bGs-K@PD$ty:\) =k\ΕnΊHD,#~?I˾X7hSnȽ`uSI5;~TC%PҨ7>F^:THR0e>N |] `YeHJW"MF.Bu3?qԾp̸0X,1qD2؈ <9.S^|X Us+nd&2\tONg~ D6 Y5/{~:Cpm{fk) D hAƹ h29D@ -‚#aJz)͏'1-hPĖ|/|̫(p&Iwػ{%!fC=xfh7`<)3Qq[FܵjYG](/3qhKB.l/*b9d0 \5ۋ'8ތ""U"B}o[q3G!m 0Fn.JV:o%HfTh pgbN ͅWyS;!%.HS@pC=%-\Vlh)"΁pY'}2m= ]Xi"Ep8$o<rWDlt9-t*h^ZOo_Խ,2sAxDGÙ,w; ҹ,3jϟ0"Pݐg?뽲 @LE{a 34&^㞋NUSu1Q3j!&T$hj:7 "F<A|͏02TB|L #F eyL2e9?ֵ2fd?;d1 `ӑXr+nG[j[@%؋e/헋}Ǔv<87j|>5V$20DM$ne`yzqNvCP˥ȃ0he;։\d$2Ru/^@ea\Ju0<~{0aе|~W/i}uӬ_ʑΏwz_-/WER[.n)A_=O>UO}KU[X ]3fyogϿۻ+N&knov'j\pMCym7cy(_a\gד-"y/7L^I ϧ35^oY&k.Ed[4z"S@  ҫ,AԶ8D"KNBCr@ Akת9Z9QICj-὜)GNFe''wX&etr^bG]i 8д!=C!ՒNr @ s8 s1MH@ɩ ,*Qn)YZWەbgT]4[Ax;w-jª )fu̪XIQu. 8 NoXC=]W;bU-U- ; A&y&Ib,#^:JkooIU*j!۸N^kw+d[qb ڑ{ˆ {e'y8p^a{p4߻w;gl2 TloHJmݎ 58oTs\tK~2+t\msmQgEmɇB`Ъh[3TՓj`n6uڞ۸2#׻bXĦqm 9)81,c"Fe:n)48\fx-$*DʷOrM=mVdu>ĪwКֵ{r[Y(݄u(T>:s f&s8hM!%6I֒ ^;Fh3 kZgrSf~]ܕ-u|sSaj[(a<WBQGj" VS}Dh"+9i$߇mV;Vd=;NM3ȍfj8]59!Y#뵦₈ٔ C4ӎ֕ FR&=e~3AڝҚk^P)iސQq_ʶkV)7Z NcW-J7SW7ȋ0׻b\15tmGЩȪώmNLMKym/#9{8w.sOwsvgHbhϚYݧYuh64 &${ax/& t;OFEP{r @-=2 x?`_z~5 ex?30g8 =n~ 5 'l2.^vk;k=({kMpy`t WfzK٫_zfp{^d=UO=){\u{IճnPp 8_s=~P]^<*9RݜK n.+(&l~!v{oޖ oO_3~S;4^Tp~2{5|}._y/fZY\? 
CP?<=}yI 穞"RA;(s j1hp֝OOn~?x%t<19(P84{rgpYQ/{ɰw;yfjg{xͯ%-K?;gg {}:?^Vu0WphPgp8p:8+RC\7L(="x=ќ /LBO'xz9[Ж8G^r?t4ow!ėWYL5s._DY@t-l9<2ltht<]MF#5ʘa2Ơyn" ~Nl( F `#i}>4a::KA}2Y0p뻝\DrGr'_5o4(ߩ"ɇ3K-^0EJڂL }Y(s F@ H\= ?=\p,{6!ft,T OH4Z^zS[D/B)f2%I^*|8w 4SH)-ο {,6N^HS:?TN=}X=A{'ui{V|zJkI;慔=@^QEIR 綗JHx·8G~8!LDq&gLɲFzaOYί"ɇW#\7{/y?W1&lcgS=EݻwM mri ĭ(*^,B0TJeK ¼X!o@/IOt h''rg集a3U"U:VQH6GՕC pvlajjJ (/0̧}w1%*Xb/T1U̩R;khf׆N6];_>4T5hy(BgGp3y4yScm>p@[΋;"Db2ƴ,1Æ0 ݝۡeySBw;N2p*b'G*#eS"y,:J"!oHqQHv>?e8n%QX1c!"1jnj*<8  σhRcF;*iuq| o8Az%3JS;k9׸W)QK=Yg#8'>:c  1"چ@ٸ:IyR@EƂQ 2Sf,jSP&Fə x `Sp*rU ccr,;ᱎ hՊ k̫uhh0B "B1Łdi#%J0I`Ƃ|Gn(8bi7vt>|= *RKbɭC҂L2h1xrtUlC,!rAxpzZ[$tYP8煋ADhR %kW(ř dD<0 tGV 3 R0p/2`Z =Ha=(($w,*qF{/f W_ nbe*(LRZ,׫c}WXC%V+^ӕ1u1.0W/ߺlk5j%kܭ>R"5pO3}>8yQFdEo&"p: Vk[,xZO:/F嚯*~9G[ugS뤲Dk7$Q)YiFFyz2Ov>-ҟ<ur< ^fd ­7R AGݑ."j/ng bDp^S'QNRG8UNhE!zT#}MEŬN"n1nEΛ<86?"/ =<3Ry*U׮0E:4 'Zi7:bh:IZi}0-) ʱ$ҖX9ΌLsx@ѱ qFhEJFgr"NZ\ZZ\Q5- (Q1ǜKM1gc$+cTJӌ+c7\[KXޙOЌL̇i (&&ɛyH3/c}H#J"bAy|OKJ<ү܃?Ȕ^gC#D(#C5>Q@a YOi< &mW; 0a)^IDž'yT~hs݊j纕?\!.[\e5~}˫צdjX-'=J&j5̬AK`D7%BYv$=GL {_y \JZ*Rh{E͢6^aL#1DPAY+7p~XBu^zxfbJ0U#}\BJ;\cX4פYHUgDz;۔XC+Y 8D6qjXA,G <8;%RS+B%6d>mE텣P)ȸ%T@5%+:zVTխXM,~IE v`k3Xǵ[VurM}JO08?D3Dz.Rogv) "cA=߽{ըJL!{22.DAMPQϧJlK%rEȄ_ QeVh+i}:ãA$0E0haO>k;Vs?_̠QDT jgoܻZO~v^Vip:m@_@Y~SF/-! >9>> yZYe)y~SYξPˋ b`2\+T; J\y7DqU{ 㼛 K4RiNڌ=߿mg?/'o4۟n&b }cf2ubqYhL6fl+o MP~#%ܫowtٍeIl]fI?C+g6kY*'1X6Spb ܍N"5c%23X Nne.0K=GQ ӦL@E,rfoOcoԧ׶=)m[^onQTxз޿Ab7uˆNeÉ}M߾MXr 7;0bTOX-涿ϯpJuӹHw;1ڻy{{γ^{ƠTY?JCz{f2YPo֣EF23eu} &E禇{ޗx_*l $y;քʥ2"WOk#es~=6eyTƌ7(/i[>|sqptvAx{h׃1<|Zͻf]XXkSHRjce:*qvv@QD{5)Ȱ ͑jۗ IJw~ :)8~=zT$o{燧y{Lis̟f2U '+ucQ͐+qR;7Cg|J.{@dGnz y+_7 P;PBMnTVT3s!:73ShyPboY^rWowCݫ~uߝW>[э64/{p#g`[Rw@x<|W#XnXo?4_ضg)[z^i^ބg!׺i=:E+#\v$+ZyF&qKG1~? Ο/t󩂧׹@6)9^y}i.NA*7kݒ{PE l-\p[~ o-\p[w}p[y4TY%}VQXr^81u(2=GETJ8hp,Ja D3MqJ˔nkcپoWσ,.  Ezx[>KIn]? 0@A셜PNT?'y>OZ"+k>P I`_'o.~9pTyU1!tsp ]'m`f`.aE׫Jٻ&c5}# <ƷW7уRnUK>)Ƹ uTx^7m̜bQr~Y ItO}jPhg]Gwf35X"|? 
Jamun?7l1{J;Q#ax43DJgRc_"HA0!9=3z`Y&z\TƇE_=C_t+Uڷ%/{S}M"x)~)9a{B5tBo.NL<<"{v>h1MOuڢFu"-rڼ_2*ۦ?"Пհ9x ':׸܆-tr^~v~߉{Mxy N#>VY1V! Z-b )7f/&"O  5{4WGC5)۴i$( F 㣷 ~ap"$Qjdz ^N}NI~z8 t#vs 1Tʇd>!M*Bհ=z3(C0-- @!L%ݖ{7<\v{r=`iiǩW$&8`ח ñB5MMmp Աi'UJ1ўljhKs" H}:sR{6AK„ӍߣV q3:DPj1n|`iJ" Hb q֨)-$,"P` qij Nz8bqL2ʛ|It!$&/cT..c O n¥6LRΩdQ&J2 MSN 26`6) DJ;]St2y^ ;Wc,5̇RgK#gҧز51HSk0p V2 R˩J `3E< [vl٩<|: 4]>'4;;ǡ!g4J )2x' *IP!b-OgSQ賌; ȠN(a:AVz, N~uZf)>Qv(,ʹ nMY1(U>,'ȑLK=hK 0`B$x[ dI ix( Q|YR#!XM- hD!)C)d`k{P΀3/ 0E K 0H&#,*( #Ǥ"ss.}~kmH9{ !M::XV'zHI#rHp.$-ң]U];OB/A+;nM`VPh H" FBɓ_1&RdyH8Vc*I :D-ѨBhlX`r{I'Vt0]ٿ PJ9eZ}ZF Лt|$ 4 BpUȄ6CXlfT<8`Y2DS88 X|SDIB'P m%)5:EE-oˆÝM*r !j0DRH}DTQ\/xg=5ZG0nb*-0baZ Qv@D.*^I4IZ]kD!*FQ&PxD'Em%ŦckhEѵ2EH?/fWlսic3^hSmH6dž-s h\f5jʛ8)1稼YMn MR6V-L_O1Nmw +'>_jwD;{xn(f"2z=v ylA;CT=s _0%h}0fl#yJ !DJukɻF1`.yǗ) ]p(Ac"SyS DQ N!l(VsQF$uRcEQ[Qs|&(\>SrWt UЖ ! O(^B,y &eckK8P64|CE`@0 DYM}"IcK!`sʀø-p'U(>u觉jSJDAk _X?E`E 2JX|Ⱥ WfIA\nGE=2h5,;МOxd4@;6 ]q׾Ab\0< SY>ԉ ]ϕ;Fac@)Ns,zNt o0d^&" mleruhCMAr6`=~dLsnM ΍LΑ|,[78B'$ONҡ!9r@\62֎)YPF08ZZ$j׶Bwn=1P s}|0gM '#N%Wl:ǹx\bLrj@A*E6^OτW* EVC:qh[n*]45 U{9VFaY$8d:IAO-q\szHk@p*q48Jrbuwn(+u 4q%$b)`KKk AވJoPLt$`.j2FU%@C>H!nr&Ұ H@Yϛu Y;fOóf}7h.!I':]- [DdۼݨOҢ'm?GZ`&cFaL4Ӟ&bXLZW6ICɠ6c%_DF%ْTh\-rK)AԔ]3eF ɲ]e8Nbd;߃CXwSD/Ѻò@>ubOiډ\N|7o"ueC95KftdwL&Z4ǽT+l~Gs'[_嫯|kn)iMѪ@|Mi ",?wW7OL.'3; ˭YM&.Ǜi8|Memd\NVqX}w&uL4% ̐l9C͝Y4Ի,|uPu!=U>ZQa,61]%onzltM90...K/]B$7OU~]4,1.%YOuF˞I͌ /WL"{:[hx^Gf\qw?+6g&?MGZ QqWgj_~x1}5k6<3JX=/ۛ>}VCc3KVoYwBg[)/m]4sO>0϶l܁dE,Vkkh͆X _tm]ȁp?ک6|1ي<#~{V6ڥ1SeƯ85#15Ny% G ߎx?_?u{ swp!QGLP ]M$2 -pYh"IA\q-akkVX޵&ͻJɊ w>p7ydclYLkʓ?yǔ7:6<3SJF͓(׏q'O+fܩɳQSӡX2}F,0Kz>+柖?is6>9r:lтtHmpd}m *lI!SmA:7k | _HW/z-F\kS20#},ňPo/Fl490HO/z?2ؤs@Uߪ(fB+d0XW¯W/mO#7OﯩA䏓|~n5Fp6@V |i]Q{nΖ3KofOs|7_sucO.xkyU"^,dMfo~xfq: 5̠+Χn<~b)?{7=U1IiK$[lp g'=tPxЙ<1>q(d >Z) 9%NV"/MUfJv#RI䌻1֜+r:=)5 >Rp _;er?@}}InG2FVN CE%cGS/ա۲hgZIh~+hE_O).Ak>_| hby=Tv?tɫzo,|p3zo.*~9,T(; 9X!9;^c6|Nݻ/N~m!= Q'>y^OELUzjzy]J-[W Ngn{2@Vzs :hSևp S&`vTQCn(&AÞSn!dPO?ݭxdjbByE&,|?}Bspq6 ek 
|ۨ,7?ˍrjP]sLG:J.2=$E':q9zM֙+E7R<,WD|ls\U@wcIj(jӄdjԆ~A_Fa MFm݉&-[h@jQFI[.ɻ)\_O ԡEBn T=T֏lC&akF=jAVq,mM%UHFB k 'yQj[V&wޫEH=ߥ k<}F=7¾AB+4ߛb!ͤ>)?j[6urиf{?w? )a!/`rax JW9(? GU!zY-ǠypFPY^tU ̝h`S4hfRfh⒏6`~h.̍0C\pu8VypRGl6zEeF7 +-8ʁ=^axsLZuLjA|<6AG{tkN -@Z.3`m=`L8E'$ ]Ib鶄,\vV-a>9AGFf“R[lRa$2$u}b˘=4'+|]>̨p`B˚ ̄l^= 8db[YC nr0CC>>d4x4g[+-Pt,0 = I'@FgYj1(HZ@XT 48&LKeKˡe7xiP3z%0w}Ј珍,cES.)VpfQ8>rQH#J&H\Z"t,+wZ tZ@t@eޛ jS{I2x]heE2.*U[Lnrv6w1:6ۍmC,WO"%޵57n#RhWÞu6;L&eˑdgfSOeZ,R$ek\Um(Ak 7T0<5"h22hpt(GSb8}pe |/[%,焧"K<FSr(4Պ,WK :օ{Q ֈGIr"sY$g@ˍh#Oqcnf6FmO/)^_=?eTqTKhşqVdOՈ2N\v㪄cC9 ]HNZy@>u# (SH2nF:> j Z Ǻ S PxH()ah҆2V;0,P9МG;)慓 5\֣#2vjna▁6SJI!2k0AϨݎ^&)ͺK|&oE6<|?zA:^LS]w#2}$}FݫP&Y:&R+ jZ*Z}Aom"򋃐uT@ B晱L T2uLC3tBOՍ4ϿO0)qsSX bI $P!զх>lϱAs ^hifåc)"7KeV崂00 uL, >b#DzYqx]5{0 gyr  ZoX|EP#<𶂀+Z,_imփvc(h,>: iu%#4%7\ цL8ԉ ;eP(WL#)ф&Z67! :j;(5u۳VJ66!"RW q(,5܀1k.֏3,·É_sr!r\hHnqDo*V-Z S#hqe^mh<]x~9+% JVyĨI&<$ʃ\:/7GV!8ޟFDA gb*'o>;///UE^S_NiaۆќǴ!d\Q(HZHpq; ܲ6CU? ON;',C15wG:ڀ8ūnlN RZ\K7;:K&j#lhyAZ8R3)gYq>rc`DW=ZQnKΣ{xkjP T"SLG;, "uRmt)pPiP*.ٵ)1uFT?Xf}_ޜf(I}{^f=/>k8}̆dY>S8D6|lO?ټ*J# 쩳5ۘ+ju<8';o+݂?=sZS'AԞn&3N^tM2(Ukf_'t'h-LCȷ0d_Ĭ9[fvhߥ-HtpCԻP0ῷpa>&@uWp).4OB#@1}_!\[u!WUKZ&Qz+$c`4'6Ӝ .t&e@W%6ϔ(1[F+Z~ L *GIOl'79N`# ogqcj2J#3cCf#>w̋M3cdȵήOrxKmȔL}{%RܶV E ޓ4dLsSM7qgEKO&hGv{? 
pq=g x*"ݳė{sx֥6yli*9uVk6.n+ߎs}j6Rdxd6PX%z"_Z$n.dvM US tWq'WSϻ6۪ ,!a3]]&(.i~bKPWKug뎹]N_pp  -%QCݗ|5[2=Wr@hR#~hpDetOM_C\쇻Y-Sr l_j괂(liW7ƟyVܧ$+6:_v"Yz)_Ӣ4\8rmwV !Ǵd)}2̐P(+.?3*Dpk!)>(-bGc|:Yaz1Gc]IR!jc=AF&%'wq/ ~bhj6,HzpDiߝԻ&Og/v%I) q)R{׭W?jԆZgv^0פ?8IV)_P?ogwiƓKfa>J__ǝhem"\K Ru!"v|kǓqvS8.%ftۚ 9g,511>sgB1MP*I2c'Wr4BиG+:1Rktg~7M2!?(~~*2S]nN&uvJKl8 sSbXƙTȌR [-d_T+VPM_*zQ,*//@qNHRb EG n4_,:h9.wR&YrkŚ,g,'*h;⮢ '^1rBw!͑2Bs4tӔ@!ɿ|sB,|\x]:IBADtnP$9Cqd1*΃.ՒMztjNN3:/Xӥ|6rlăh-ߣȝ7Gk`St.}Hޠ>.zhMbg\F޳rt.vYu{$>_YR<͟͝S⊲3W%6V^[oʢ_;_G]HORˈEݠ$䅋h*g<=ݤ&AuŠtv#Oڛv<Ѧڭ y"&SL%P8-\h(No'F鹹NGuď[֊򷎗c>\4}6܈"ƴtH@NC[MPV@dB*yi(}= IJ4mz~cpVFGhTbOQY >LmF5Zq&f T3ʙ66y&={jRFqNEwi;p;LBԔwRFT;Ce`$0KP)HĠx@Ќ˥@i=ѝ/ؿՄiu)פ &Mn 2s$ xFvI x{& wjyn*M)VPAlXgv!AW]CcĠ_V <@VE]a&\e:8yaAwd$X!lԥdDCB9ny-cjjдp!ݲ]4Diy}vC`8E/}Gj^j;Blm֍&3lÚ1B4_3~-֌BG@rj 'n4֭GPeL<]_3zD@DkF)GR.k;v7kfoCQ &ڬk(rr60S`~V$9ja(m6ihR j#l5¶ZO )͑޳6q#WXwU74h*_6=*.')däķ~!% #b83i3X84~> ]r- -xE6:K!|tżԠ}7Rʈt7xU; I z*bt/.BYREB]8!cA-$ZZBQ1pLF#Gє !H4!%GcKS%Z5]_.-^Vgdi_3/f$eԜ;ؔB +Kn E\a@\Yt,Aedre m5CD{2x_z!/+{bNe' [30f:Fei1jOgΥ /#ZN#UމhQXɥnؓv)^E43xN^U!A iJ4%8~F'MGDbVY,g1b݆ᣘ|wR*'჆j^̼I̵r}m>f՟!՟l#oL_q"Cz(Tk&T;_1qKjx#y4+>I7RHУR)>jUxc!Rv5%f'Fe۱BWTnlO3԰1$xtqlNnqmhT3k$^WppZ7D s|;xgTi``N (JDgpj(>aVIFMt"blr \fFIWؘCil2N$նh[iJӨ&A}!Qrw LSpD}a2֒ZdC [> }q΢S1̩^N6 wŵc6 2~ BA̿/@pC9(`W{.~ltXX2eB]w^(֊ kAhchJ|EPqis8IVЬֈ'~?*Ԟ,#Yۂ2QTrzE7Z=\6E|,إ?\h-6w<%"FQUk4sI˱@K5ѭ 1$ns JF(eU-hVE 2i:@H" ϷKQuT.F<͟ GEpz4\{4,ހme ̥vTBARe4TER1 g#>o0{Oɟ 4=O?,zAD ~8$!y*;/3 N+j3Ԋ _u\$]"2u]R(#}4Ƹ3:ptBLߥ%N+bP2qׅ:1qwfbInz|V(gʏIC)i"% ]zP](bQ\S#([\ t7[bIieӍyjtz"MjBt$fU)e; wTv?JBBG_$CgXu|U}<߫hȅ ZaTc"L"1ڢwx2GbΒ7i+B'_U;gU {%QڞH&|7BHM`}3;-)9}!jR?_N-j1u'6=Bz̓X⓮z5: UC8$#s`GNMv g'!PIx{[7E0w^qn/x$aY,g$6 =8F%iu w|d:e("DOzBw:r|u %QͷvJ)՞NC1B8aRfVXY@IȺ5J嬹Ry( 98uʐ ڼǽ2.55%*>[;BH%VOv:!{{%W-Rֺ(`ʊljHUR# y\47)m-KESE-|9n. 
֝wes#oT{rCTTt-8rY'PcJJЦoc$zPF =5`903 $}w0gF+D386i}/:5>Px%8P]mw5"Hk3{0:~[@wN5{D<oU:aof-.禩:<՜:ڎ.xZ 'EПٕ1xPUBD ,&8XkgW{ ZܞyCs%zͩ R309G #3,Ii{{Wm;JvǴJɕ#_0r~Z=ޓa[:e>鳰ݰջtޞg'mݞ'oȢ+ίiEB_#berిEE{`ձ(,A}^؝g;o^Kgo<#sT;n|\ku\6}.{(cvߝ=֫ԶqtE_ˣ&BE ,,7QiW5Mz=2 0 gۇ#>*$|y,ylAs  /|^;,kL&z~[++Qvq`RF7-+6A22DsE5G|Dm eafi`M*3W9^qU\CAXF OJ-Bi*wmmJsdZ&>[uJuR.hKMQvrRE^p.Z,j8h|nt7058a|;! T rNr31tIػd)x|z.E!x$EaPdF>p6cuȱK55Rؙ>=-m;رhƋמ8"#N|qDpN{{Ly9Tw]~y>?!L?iR,l+S\D[G@ќx[FCUuyiyiyM\a]Myw3 Zw!LiH c:BO¸b!CiW.v_-cy7H]\߂[()HzzziG S3dWH|e蒐oKai&&Jh('ϯ gTyiHW妼{j?Iw^ Aۿ׸Lpm``RxR]@2Ƥzg'`dp~ļɉ79 (MN}=D*dy}gmh%ZNк1u7}Et $3Iw\m .A#:D!\Wh {ON'XObtLדSڥ. 6$VY&yK mEGؼ9'6pMa]K-MZ.!{ t48]xTqF[E)2CQGXN 2'/Fh{ipv!3dq~;oh1 vϬgLNK0<.we8L 4iQrEhY;FTI=}pP #vQ  J.ef^Eɫ4O\ Klzn -Q IYqM5 J3Q pTpj5!w[2ں(s=*.SV$Gdsq7 !(yn JE!Ԑs#JbkpmE=AZ>:aWZI>5{W0'Z0ĢB(_Ǽ˞;/];}-яƍ#pۣWTk 7h]ͼ}6"6_'5v^G}Xd:ؒt<# M/voH~ ]/Lڻ=T۫v.9vgLНkދHx,UbN3c&\+hNCYuPˎ&d,;#ߦh+^t+pwT#REhxvud+i-N-2E"2(גXej?e˘n)u( 9!Z&Hai aC]T%׌krMJӹ6 ~؆gu4t\Ѫ;j$|n9jM N }bwwGlpSxI1uOF-9&Y#mtL :: B1u61uW1D;Xtq_)LYJZRĕxAdBȈEaD xstǁ[#E(VrOIh'S&I 23s֪RRDLJ+pۂNXsGiѢ98F"1MZ26ZⳂPbc6- 9\x'O(@cvZ .g--%Ɛn%$zkecZξ)oj٘>yZ:2$Uo>x``oH "Ƿ:V o烈S}ЮcȐkKc~: tWm=3mޥ uJvV1ʈIyGɈI{h']mhCtJyS^:@ѽ)fJ$ 1 9'wuLl袆erHn2Ad$ i. jrA{W.iMvG70-n!2fFWzαkDrf5)_GhLGGńzt dTBǴdŘ$q=/aќ?3/voc^- Mt3~<ȣXF[ %$@w5 fTl(1˪x }Dzsߌc \inHK¿7)NxEb 1tUP;ؔ `Ɋl  k0"x¤B9/ӽn2x^E;#80|tXSH0BP1ʾ;NGU%_g[Ro4lLG<{nb.E5.]xfgZNx5%גHl r;R-$ ~})F\`ߢ)RFxD!9o'ppvӯY7K2L71Mt~St^-ZMaeϋo" dPU+;@K(q>?!,Kԣ@75?#AD<x^ma>of1GPqxKٔ 4LaPHZE%cs^ʌS&'26'Nנ:%iXbˑҼc5CVc l|[-jJ&As-@qj_?Պfjv_꓀σvȄ& n6wLyC)7[e<дӱX{O?4F/J*G7Y%|Zc|7/PxYל!H/Mݒx@iWdYկj>tI$hH_C4ky*puXEE&+#ӽ3s7m1s v(#>7#Q씝-:e=ZiJ56CI )u]lIx#hLMڌt}F;M^D6c~<}x?^]C cdc_w/qk ;0-Ϣ#z_ω%hh70B ߚM L4B{Deh:RO S 1 >L}6tθFDӵ;E['mtȁni}}n4)q q\D[0f Dz2A4.l0Ud01W'kU|*-KS C'wN54A 9T*E*Er>E`"/I 3RO y ¢R1.nu '4VS7F-P"-6oXU}]%bd^)p.!ň8ò@? 
Ad]{kQJoɷG4~!(5F ޠZ&R-Il>UeKmkAZ'O89;!mƝpZf9˸˩*TbJ;myŽպa0o*8[ ;4^j(3buf5mP%Z"G7Zv\P"9̖'S^*ɓ98@+w=_vX!i*կA!g N)8 25ҥ 0j҉n;yPuxQ P1/ʦ88z5uiJuZ^/*4<ЌUu䆑gQ"KM> št9zzm#$VU+)J}d h8]l&y.5I<٣ 9YUo/KŃŮs/;~.0¤[Ca,[] 3t 20,Ђ%lЇ\ڲ{0-f?WӏF a^Dhb)a;11!UZNWIw~Υ,rp O^wֹٟÔe_*Yûn}otk:ũRVÈm>_Y=5d`vJʧAitHdjNǫ͏'nߚ~>nC?c=oCFP&P{[z |Vp jq#ZR68_5IQC 3C p"kfyާ\o8; 2HQrnZQ+5\W ՛_hZ m4U3 16/2¸*Ei;]-|xmkPC w[ $U327[w)aWTwZZuc\lu"SB5'YB`Fo5<"&a4k>=/99_U։pt?~g;]hn,n-묲':gZaVF?9x*2*(QmFw6Xs;M҆oZ({}\=5 шZlmiDD-z<1XF<9skdB(| ܛaOu8 ~1J'3)FAhݧl45h}ǾUD>{[oNb~u0Ars\.c¹"*[x8Nk}dFEӿ}^<~//:Y/Fo_OgI|| YX0O6N^'@e@g_Mspa0J{{{ҁ{|/`:ž wo5M2L4Cy-D䔔fJI݅T  Q"c1mN 9WT5T"JEsA>\zrM ;jxE kA2xʩ3&%uR{L$3-HYCJڧg!T%ɯ\AM`BBp '`BeD:j& ci5J!LOg5bTje%vXcH+ije{#9.-"q9g*yB*)}F:Xx@0'%j(LTHC3qq0=Q{@T$i]`XC;(29n%_=U΃Sׁz2Hf5F haG?ʚcpǧ704ƊJx\z 0KEw`yQ+%?o x2-8ޡšo cNSaY֌'׾; ,B~O)8:~s?? ԱuW{\Lj?:Gw7q"_nrRTn]v,I׷J}bo7թwu qJZ@qG-jqJP;30_*pga̛?2'oPo aZv Vk P* ns,XTD!<,H"Ry4t Pl{Oaooz_ig`xmMjz)`a4`ô5ULfPjٍ(Y=Ck鼼fkjZcV֏h,Q&3<x FIa7YQH2|mm(K?ryX nW"Al\EQHg'9쯕ݹWRsLG\B2'cx?`2v1KJɗ`<%Y$֟Z]}4p/香AG" M;< 5Yه<"'7f0Ό6bЩ2)%U:0^D'D"҂e$5Snolu`mG1Hr"@1ӌ3cl@Q#uж,nm =Lt]4}J?OOɬ|{ Ar8OYT#F Q,ҦZ8op c^ɹvl~ a?c?r;IAp7_lǽooϽ7=a QsY]B}8'y4M b[X);u=UԕYBT8]ȥ$!cZEF vaj՟@>[z9SgYw%GXdlhv Ƨܟ p0e`器XYR8hHw4gXʖ5) MLf4IIJ&e4I0%?eik5⎓! 
1PD"D)!#V4G8cz W' p%12Z,C4̑QCC뗐V֯ʽUʔi2i ~p`<ҫ@AGoqo1yh);dRA/s+.i# (U.|?(R 0BfW(t)v `k.4+&x*D𦽠B3QfNn~x"JcI.O4HaHheVRʦK.ԲsPgT9)Ai89Ŏ;9݆7Vyb֙+-% Gž")X<ĬhPL`^Yx8^q=,S%5Vo5.WkR85p,(06`gdܨ4P+p⢴ \=Jk.5c>)h Vr`2Q˝B`Q@"1]#IO#oa)թK[m]g"#27H2H/o3I/ u2|";K()e˕hj=2voVmи@[Tl{6ksլ>2\֐f2/m")^i퐙}L;d9pJ`ȷ(W m2Qն2p R3qaPYpɤqʉcܟޣ|&p<7b+e tYbc k/D\Ǜ~* lYxa;X~Ih#ȯAz/`Ɇ|cR:~JSGH]Gw{ o߿Ua~SpץJ Ws"2JLhҿpJ}fw\L%LYd2vBhԫ]ҝiYQfZ*;Fя5JYkEeڽ k()"%AR4ʋ~x|.C9[3=ӞTKAv\R;ӠQhdϽiƤX.hږK.;D`;66BK{Jc!UI.#NA >^zCMgy.jιE.g +TrJDh+ɑ &({$xsR#xsH1i?.&=)ua2VLpنԛTů/z,l[K[vJ ;Vű]kvyv U 1wK}'F8b -9+'=op*+\[N5`OS{[bi]ce2ʬ6S b  P(;L̺;W#R m;MC|Eۖ F!Os΂ g 0*4r\].:g]^#%cokfADBc]Z>|4iD{?w,69p˵z\k)!wyGLMpeD&3Rbp-AG>Dt8W9P:c8:fQV5`V>]~2@2K+'{-8'g'v'd[ɒ%Ofk E[ QK"h974Lѭ`+ѩOt\du[$vGgӡv5P{WOYĨR`܆>b֢DU|_W\aN[l;p%1{d5iQ!D#&tw "nE2ZJ^~x%v(CoP_ iV,Oţ]rb]hĘG!9A*0F\X,S#%WdJP@Dl_oN鬫Agӑz&s"AC*V6mI뚥r uMy!43+p i'%?fZqkFGfțuw%h>\V#KQ1v}I̘aPSP+C?$IB/lywapwf[dR(R"YYUŒAݶdUEDF c@:!T8lӤx | id ]}ym쐄wr}+=XEuyȔB>c}jRsy.NZΞj#Γ$왦'?8q%.1Rѱ6K DzW~v wwqx>^ٗ?OΞKBpEX^&Kn= ';6"5!ЀᰭTq8l["(ۄBMRvgݽn4;F<>c_[ c_@1BYVZbb'5H7 ~wM%aoQ~4p޳NUn /QRT8@;,JQ ׈;N $nȳB`8r3y&'58 Rp`AB1f wJ%*cy3Is!K!ZnEM?QKnq/y|y#45wB w@<,I$aX׫<h:FaGhL ߹v;Ec̲ͥ>K m8Wp)S>  #oi|򖧝ͤ#>F y(;9)LyIGM!86ͯ$+)MwrS' B&P ,:ka[0)C>{nV ]1%KdFWW8B2z:Uݞ8 m^nQ-(SӎdFW_]$y%H&\ʁhb)IPW,EiHbF)Gô!(1 yp{Q:PEF z-=ZG@N믾uBuu,^!\qa 10;dRET '5!GXvhnctѹ&xoPH#_f L0F@y"Xt+8o NQj#:/nxP';:@xGJH*gZ #J$ryk.wrMwϦ϶TNEV9=9.Pifp , 3 vGsL;sb.@F"јȼ% ^DGqo1 Ԁd"# ">t,{tYI/@dêU/gӒI1iۼNHF!`$ 0pwsZjacXF ˴rP=+c'ۆ0[̓-I|u0"0X v s`Rq7"Y&G Ӏqk5:K}רWf8CAg*听N%r&0̔%F# pzdqЄFJZx ܲ#eJܫd.gp%BpJe 7f8 jua\:$)(WHI+/*81G4eì c k/m0i`28*bʑ鶴!kAzQo͇e幙~lʵ0\W_չT2l įy+vI >y;/0Y1ɪ(RwMo " iM^Lt_stqwB<ӷ'7W\2`]H$\OgŧcRrx$o^LnY=RLjjݧwCgNw~+~Рޮ8.(Bmg{wb2hkP98 3hhpvJK6̳WH?!$2{DUh^Q:aNhNI6 !HX好[揧ZV1dYIQIfI7ʶՌjO,:%8QCKyҕcոV(eG|ce3':tk_>}V̻CC}#n v]|\|&Y+)FWl25$mn254>_}hq'D7R L;Hݞ3]%jD$tʷ[Go$ʷyIS2NYg,Of(OÓzdF2jwufp 8!~+*?CzE,DoA1fltz=ݥ6[OwI.U^FV3x;[2փ;0q FF/2OoʹLj\-\Us ŗItpah ﭴGQ;驛chUS tFfjX1~*YF[_Ixǫ'Jc2Q 
03/c}kOCͬOjfE2RYN~zwI.YUj69<'QkR~42XѴ'VnEjvهA!rݺ?~}QmĻX>5~?$,+Oq n>Gר>A[9{ZTtWnɆV22WR^L0*:)T5}VnB1y%wjuZ_hLꅔ8)p+X%o>Mnw.Uup⧍oͮ.<}~rIaoh=x%dc4fQ+EI4VJJ#{p%OxDA Qw}TL5< : 91{A#!EUKG\ 0 SbBCP` Ǚ 6P(AOL_]`D=1.Zje2 mٻFnWTzg\\5k+N$(J8hQZ)v{˵ck 8OS3CܩGJLF5D< "72.jaD<6:qE@R)+p( 0Aqwʭgq4gQY|N'MΜՆrEFQ &VYaRԼ ;LPvqk)qk)Y>h8GUl{opcӝ$wϐkሪU{DUEiuwo_\{߮?}۷1-p۷1éob4hQ* vGd+]SQjuM풴&Vc 8@ЗXU:VFA=vj Zni - i?E'qEA㛛9?:?V߳lqq#Y vLYj*.Z+B#{b :J4ꉟ] = 3SjVǗ/q.Tʄ) IC}47 h/.V˨[_U>>юeQo-/_nCҕmu8rNݨ!6JV^^ p&QZBr Ntv/{hF֒QVe,ǀz,g-4V1ץœq$3:BkJ[by^.r7ejM|E\+Ah̞.>Z2FyOd* ns7#Pd"f\iEdpKK7Zc÷VJ7L(syP;Tʕϣ9;3ɣˈeHy4bx6>X;;1\4yTml)Qvn4h"!3ڄܝגM"vVV;WgN}0;ƃY hI0T8 5>m++WW_ٜDZҏa{vVc;7ǶNK_J$rIM>u2۠7: .\4fOâFLUPQg)d $mW_l̼\m$owR``sT4#HjQ%\QB(eZ9Eͨ`M=E՞mO)fn13m-͜<=,h;`Np'|՚ @Ml;ڶK䤼b+9;1HPDs`a_<~*}\^,WY2իI<~pgPسT#̬Edkİ5Av6O/eГFeR>1٦PeO^&l1BrCD zן:n)y<L r%Sƽ7Ͽt3QkȩM9P<6.2gPCd-'-GntC"P!h[77 :LǿܒAG2XolbrpWB`f4Қ >&aה! k̙{~ILt 8<$Ź]~6mY-wl&/3[m$}R%9ѭy&̾~!$h^1"Pf؁>JPy> X㍥ 1&xƩnj)Qa1g;qGΛqoP4 "-?YZI QhIDvNkj%] JkY] *IʀVY/Vl;{ B)%y ݸ^vq;#8㼩"gsZc Y[}sb';e!Ԫ=KݰlfTlg@ĸ@2A.b눧)жĮ!5/ {QKBo6v͹ZbK]el-qo|.2cT(1θC^Tf3۸1RTNowJ^v+zMc1l=RB\|iXK_/gBO$fvcp]1h~eWmn;Q!.YL9E2's]6V< 馃ƒ@K!W׋VNsm8eT3`3̌"\=;؞](7Fa a*;L:3KѻYc݉aFHn[~ >?)m*lHIwL5R_Rzt%O#C;9>۪!mj_w"ehv=WQwc*t'f J4>zrlSl3%aCwhGi-G]j޷jQw&s7]`p ֛ͺ/3bGt*2fnƁe>@}~Dף?{x #Hu؀C=/,)&N"!ύ5`uv[/I~;7 Xw6^ithk 4@W8㨃S*:򥦢\6lp~ ZY93ID[$Lˏ:B;=Ttlftl9蘤ؖ-??E)^l*L.!-2!aMjFbSpPz\jp+kvpXGcDs)K1U Ԣ~baW(jARZ^b%S mJ , -E 4mOjHAKp0]]G[{~O(KrP }Ivx`PqhuE.+mlF`zAu>ݷx",esɱ;5Nq#2- .0yd2"P9=__"[lw:SC1jcF]1𜒀Ā@c<)LM<s0,jnq @7HYpt[)~st} :pw"wʉh*zPimB?""Y,Ba & t7Zfv9p6OQvxyuҸLȶ2ɥXˢ6SUi6b&Mǟ>V:ufw,x=_e[\ 3QzRKH rّFKy3כ4/n,4ss v? WGi{q%E錄pgʅL/l^K`ZHe+Lz"ķɰj٧m()U&6[4%d\ _CvU.+iY6RR(3'w#n#/hk:r5oP3}K:*;s#"-e32fFs`m3<}=&Dv"¶ja&S-tڪ2]VxB--3<@ΈlOaDDʣSx>Xjf譱Jh&$J-lH^tE} q5EgfT5Tc_jMJ5*׫OҐ}׀ċ#ɬ%$8z+p G/ۏ?_?~ϏG\;Vߏ_9lvy9[~͛7/`QeN(_q^sTsv>oo׼_˷տ[Fw:AB-!ӄz(RȹߗDF!r`IQ1XN3{M祔x@{%sDi\! 
aht<:b^hn#x?}0[̖~ _?;hDgZVTS;.2`£)iȓSl$XMҒ\dx\R"BФbL)2' ")r4%[xs^dș}4hNFf{Й `Yn}5M$Zbi5H\6g(DR%O͕چ[wq9>ڨW7LgGQj'lPyȕ7\>]^Ч8eOC4?/o}r3X0>d.sO{8r?$wvxIs MeoׁN ܥ5hi\9 ShvϿPeɲ"{ؗT~~N%k%v/K}t;w߫Kq ,Iڭ ^pdccL  2HKڨUk G GwN_z? C#%13 &١Ⱦ8LѓUtMH2X+s&( n2,qc8)> @ Vz :[EXĒ4*wg;۝e)Tޛc8ځNT>D4yu|{~:vNR4h'DGj>=3GVo}+$2ٻϏjWKG?^\%jp4Q~}~|{Dh<U4} c'տ\۩;7S~ hhƟ=:3ܢP%« {O8οT\L W 03 @\ Ts`sJ$f-evd^&AYDKAѶ =ErXApO>7!kjRvXɪTYFm $_XF^9s4ll.QeKUUFvQA.:H9֪J2ҎeF!HS1 ^Zi-Yg) ˴܌EFZb%f n 0HŐ,ІIN`/\MLl$oN뿘ӆ,,#%(U×tztw&A07oKp Qe,VBO׋?w磓̣M UE1& Ǜ4*b4O㩮/呲XdYO9|N?,*V]vM'?}M+Gیh'v?ڍ+aAuŠtv"O穁im6!!"L!C혝O&X&P'+2ZPT'`]ͤ&qcIMUKk|_X Ӱwɩ] {fў́Mp#pg3T|k5 0^2B.^ 6N<=m>mpӤ JnQ Vl1i4c{\t"F >)jٮNQZOQ-aǼ闻aNg aZ%lYNbi$ŸЭ|-cہ#C ;ˠ^W84,eM׾W8zKK'uc X"XLKPDTӼjz!dk.bAgL 4|>sCə, FXfe^ɃZcU)ɥ Z$%%J hCc2OV/Z0jIE]E=<<~ьCuьN(TfUw1Yv2 R1qX+xaIŹnY\|c*>~9T?O-%f4:^џr}ZO}=m7oQ`cm@gQYs]&NC5g!,ʇp,ќY-'D SѪlO2#1ʒݧ&~(qIԂ,{c96Τ"Y,)G.XxvT /K!e;H*VS=HڋB#֨% ŵZd=+| ^yc9/ (o+;8.z!Av4-9|*yQewz{\$)s郃xq׶Zi {KM5f:vk7QrZMI_&fdQ)<!'ڟi#.D%Ĩ˥Z2G2W-H-u1# >d{ۗB Z6I@;\\!g @ %Bi=U }<2%4d2儥T9d i ʥ3Ss*`P˻w퐔IQJDh_‡Z)nZMtBͳJlo]+^|t~&~>}LF_KfNȟgխ+Z֧̅ q7`Wl.w_;LOP⛻s)5˽bSإ~OC&e h̗r9"bn<R{w;I%aDћI!LJ沟֒]2%Z \Sȿc撾yeZs>3|yȁEZ2rL  &UAۑ$̤6A3S9XfHnjuxdYF-*٨8%z<qR[dT0F5:XV}ҭQltLq1~`b9W8c=qUQk5j6|wA,^59"dy"/%߂NzuV+2[U5܏?ũn{B;U󼈸A|o a^5*->4ْ̫Zz&7 r|.4$0LakȽi"W**^BRk8PDg(H>Ʌɂ:ɐ"r F/H$@KAkZtY= | bRFWiK EXxRshui$QI0lw36831..̂}>:H(AJǤg֑7hRd7@2Z_>%=a?gGa T;P^?]^Ч8eOq^6y㼾yL.n!X:C4B%eMVh=19iY)t T<\dpVVUqŴ t2VL.{Οxi}VϽP-@Jgv̖hRD7,˖h1y[{5G4{7){ڗʪH-fP:ŨR9 >5SMyZXRRчwQ%' |s! h>H\#4g# ᄐ:bY?yi3ۇjjuAZ#<ҿǷq ֝xSX#3a12%QH\vRP0$](p_侃R ܮB l:T%#R,τw.%YWdp1h?HcPi~ZSl>=Zqllk8.xPogt2Ŷ(= "!8:dHjR>۴GT!/CTQh8c  Gj;F-7 zTM;JȆ%:LTl$DLV܈@3樃 6P*t8fq"(9oV/j&3sR5SY dthƶC7,uBb7PWqF1 ,ַ/ Y+V! 
kF?cŽ|3\2ý|w=>O%v5y.ma |W#tEs#BN}Yi(r{]ߺw񪞞;ifO4zq ujQʹUQflk{\p6|23>uj=AyAmYmՊIn6v^ځSoeFPmݪ^5Z64[|Ҋo\nܺnv.vc@%vlͳd6B "(0deyKSp<1kmYEЇ; SÀ?dcO0bp'kMDQ$7ݤdQJM փ"鮮RPA`4a@PYaʅ^˾9GfH4?)f#ţz crTeGN#ƮCs#8CV@zR-qa{#3OZk8&;R#nD/ײhLE敜'ŃKZ 1Ɨ2jJ ۪Ǫ+FN8b>P;kĎJ­+8Ngc DoK]#7>/&ڄXnɞ0$7g7qM%gr^@΄ht 25mU ;D/r~^W)B1*^⻡h#yX/ﰩMIAݲŘ2SmZ) L[lM60ϼA_VVn޻(z`tX+p[PP׃MaQi[~km޽H}u]lqh`֥C+1 u[)gY?0Qj- qdGdt4H?3Q9RzBAxuPP =Xc޾iRdO{m$\c Cs8u](OXu[ ^^m[ugr&~Z%FS!y dy,],YgAu ť1ISSjlY/a2sdu>,6BH%xi`!=0[6zuFu~3 `[}wVSldvwŃwu%# //յbԟ (i:sQؘ37Wc.v] U=LR&?~.ߙ_ LdX%iYkk%̒>6.-2=r(h}0z@HdЙg7s#}t¸bN*hNhAhK|t"߫l''L9? cGb=W\O2|,օ? [?&\s !,>\pe/;ZY6XwϒSŏN$'|yLST&KJ!ϼS!ڍ7݊b`u>#vkڴ[j2z3UNL{Tmx /tک>WGk' (]ߧ=kX.^;[LۧpUk- mbk$c&i[.-2M! ]ቈʒǚ YR.ThOA$خDt5˯Z }'<H_9J% +ńJ]l}́9.JrWڽ/ N- E!9MDA(4}RL kbxR)ڌ[7F'[_wJ(J,*m:o:/y:LtV;[{;EU o?7]?O;LQ0uf3k^HC~Ϸ4kNi+\v pNMO>bΩʹe$f[=4΅\%W?җ>`9DOf߽ $dSVJ8H)gep8؃1HnD8J^ľƝd|6{) ;P}meѝΊAOLȿIƥgd`. Qd"$#gy1М"!mFBroOS ]! ` |oR:$ipv(JO8RG:+&RIC:#MlacIFLԎ"Y^C]ػԷ0ah! BڅgI=0h5ʄ̵H Es-'K\te <:ja4)#콏7c#G:KqA{d*Α2($0] *[>jSDl{4M%{lSg. 
jYmPσ WjA "rpu)R.$}˗ЭAI?n&ɋ$ϓ>Bp;#R GERZ*ͳl5 *7Du[s8|(K:zDpg pYnIqasw.ʲ48 jlpÃKjJ~Ji}EER09L1ٕ;d}|jvUxSxReR;1ād5j @C4-g_[`52`Aۺ2LipZꝆZyu[TƓլs<( oKϴ}4oInc sjӕ; VɷiG޴!prdBTEkyղUE(RT'dUCV5xs(/'ON1 [qU8u|U(HyFCmyw`Xt oud; ĭgD9YI)t}08ޞOa;rp?A;QQ:(3]4{E),ͮTѹAS93:5mY&W) q%08s)\ʮYTqƸhj\ZWjidPhwV}vHy#_vǁJZW>s-+']s^0I)@sǢ$n{Inr3}Y1$hs)/53~xl}Ee,u3D!ܿ[2 Lʶ_];E*2 RU25i=4TuCni6݆ P,)d=@0kJU[~lKbOF~U9IMfG1ąQhԟQ%8]VChhT#._WD1lj_ X@6W?_tG=f[50.t^Mg]~ҽõ{e !C#<ψ6t\!E\OPABJ)hB1Ъm/~Ӧo_XfWrM|+#ZG}xo(57N (ޡ㠣}uzL?.d u*Jawo~*ODn)" %"_Mz]>S)jS XzYvRBH,+NDu%]@aFо8(]Os}Fn./4EB9 ) BR2)8`q>%b DC K̂`%Wz;UAG?%rd QSp7t} #CDnɤnu38?wHJrBG%w9(S ~{0c)p7iMܴK֐H-cW&US?GAP!֟e!$ X+V- zNGI-`Crwe)C&1<BtȄO~r: 4ZSC0fQiqH.{H>w2= *sM,&=m׳\J&;J\sZ$Bɺ† lpaHaP.#*X"@MlBKXN* -Z!ZrH FwظZnf]S/`.^8;+”bQ\v^3FJWn3Zo0&~a9a_k W_8wT0"yC?.l2;Au9U"nю~4һӓG<Ϲ;ѽ(g/:\\sύ?EvT|M#o6 #Q租21ٿF/f4~~jgG;17o_~q?o~x݋xuݿ.Y||/5p_=>{$z Xz|'xO߽GF|sIm~rL{0I4n d.H~&r<}̸~{۸lwwvCY,`&{;~-Eo5Iɔ,ɤIQ0@,jVjv~M0//ྋ}Lve2o/l1$_@bf aa]UY9@?ydކ %垢yٹۿ2Rh` n-6UWX .Vj $Ʊ0[9+><6TYA_6CSmɟɲW_ŻK0 n'Ԙ w (FÁtDS;̲S˯V."f00CG4c7ǗWw_O H]~ݾf F4|:z8}X5zf?'q!fg~;0-K}q_OnF?֗!N}K?_0ff6iǿ^gLmkt{Wc{?ó=eO-l΂⾬#E8GKr/b昤rPT%:((<+.cĶ^!,&<(l\N_}0Or"1\^(7}7SAn,SHb2ip9+Л7ɮS *|zz+AoƺW~yP yrmuR˫2jdޫ'4v6F."޾{#pox\Q[5+a&I-#Ax.btHED9j3*!YzEYQkԱZxm](f ǯ]]yez Fp h(ȉD*AAB HOf\{w!\lɉ;sJ.xهw.En *&D2G>OmXZuS6XCqJo+bS( Jjkc=MU˜MoT |w;D=? ?<55qՈڀ"|hD#D6 0oP hHz0#_F+iJk5"w^έ3IO*sVRƒR q棜TwYԲGԫ64N`ޝNʯrG !m.:%ѮֿNۜ% num>En5A:r_PF /jg4n[K*j֗d#F f(l\3B:^)?/[4\iV/d:xGz2g2 7KLPo޼i)6-נ:*^,NZČU pGX ΍ٱ?;sw7~;I?1_5g]4Wnjd@;!Nd@O!鏯_}7Boj1]& {(OAXVgdxo^G!rU .ÉK3B[LtrAu&ja.ULw^b}7Kd~Z kzEuQ叫X2'x| =M s$Ngܰ> NO.-`(Ϥi1 ! 
8ҩ  XtScagBfǔKełbnf]^g}z Xw0Yܜy,{fovW}\W-OH]R)kTSضe4Jɫ_Q4ꁬ )X}qȐk;NJ ?I2LĞDuuEA%vo ,]{U*ՊDU*\Q.F[UjhTQ-ƈ(ϵ9錅kf FS&< TI4IXJŒu(ލO!/yR#/9E-DjB1q۳NMv8&1qLb1)'kLuQ{aMS:V{ _^xmdŚ)Au^Qyٳ *@y]\}U2Ãקɱ(Xs/w_b _$NB/N~_:]JJ۬PS-X)2@EAieiv!CU{dtWWh=Hm_COẐ;K1'i".͏] ;3Ám(\u@gEx8~n i,J :` ^z6-CWzYZ [ߴ5A'XճJT`Vq?yؗKNv}1=YI >N}pzDHtqAP0ԉ@9+)҆ qVHǴvU `kiBw8j %Bv_!X<8wUg :)@ʃ"^b"LəgB  ܗKp4X1%dH]s)ZT^Ap+~1qҒO 0[45/ׁYQh f5ëDܐb0r7&؋ƞ-5Ng2\l^~zVvq0\f]QN}.઼qdoء$ٷ\XûmʅyG~n6=̎{ x)bԋ4pYu~Y\"cLτ}- !5L2v)H4!B2RfjGP9p |xn I4$BM!)BM[=a\>F8.97 fČF Zu5D䜐Ҥ\>WA1]4n,`'6<53jӞ:O̹ZK_dVC:_˷Z|z|2wx@⺋أ8W? Y׏ 7.qmSƳv蘺(FqW` (NK B~V[r&%v6|u z%|=;ѭ ݺn=~JvvvvRUk:G!Sы)Vc`*PT¼IjO8uMG<Վ0-ǘ(3#+aPw+Ͻu p NneI\OxǓxHC`NANX y# 2#,JCJSL:O͐T-rM! jqb0s; n !)3)`# s@݁3L"BdjNr=Q +h4E@'1C X?r{On!.NY'=wot ߿:X`a,"W̿?*,|?ŋϾyp33HIE XPO9߰s$XX{7s5/S6 )LƙRd=ٹ;Ҋ qB~ mŎi/3ztdbFk_W6ԗ{A%+w曃B|U<ߨMgšˋ LF=t/ڧn;e[p_?`Dri\y̆/?& R[$6ȶqC:ƵT' Qb3EɃk[~l eY r*{WF22+/R s\ )qgwt! pJ!iīԦ)1leiɩ'5KD4!l R~rRDCNZ!1؟% 󔛠њ-j[vAVjR%EtqgLwRόԞQ)k 7C%2W¥??o/&޳!-C`$;4AC:G31 .e[! R٠%b v)PP&v 2.矣&(iqzZclL+rQ >rM9~㤃A 2`NZj8 cC:gӠb> >[toL%N;]R!Tj֕Lo7Wnb ,FUͅR̬#)'DAq_ &U!%.U`of(y㔒 qHq*)u e#_(eae8R"G=8FDtܩkhoґ(! 'sXaNXcuRL !nXvti9V,@n-?LJ9tCeGxTE+A,k 4+K7uEC8GjqP#nM/ (?t˴~͖%K ͸ߌGNzxz2]~2ZIh83!n-femeSp~B1gW6ZLsۖ.^ ָCٕF '% -rN9Ħ;:db9*W^>{~֜&}P!K1s% =Zא5btL,'!b5dwڹ4$7`z! 
s]Z6ßFX Xse@r_ Gq7_nvS88>|:Hl/N}QpϴZj\OWp;¯Z;( ypJ=Cgl.ڦHc 1YI(ŢʈW`ѧDϼTT̔,0̹\ҘZaA)ƨ82hz{QF0, F.yƵ K-Pj̈*ttd9%$/v|DCMPh /8m]g>w7a9_/f7',CZdPt4h a|'6KC++p"n?zBR6^QrTTES GMW|~!Ae~MWSl{_ ,WOjFhF=!\ _ʜv*1*| ?k%) .~CorAr p47#Zs9dŏ:8_Sk|@ ў\uƌTUrjFCQ - )3 %їhhRY`a*!>+_ UY ^Dp(a\8"aL)-ݯc?υ,p=xPFp`JQ 8:*|85݋y=)QQV(I4]] ұcXʓ pKqK(ʆ_њ6A`tDIQ/0YȪ ǣ}CKRs=q.# ]3~=!\-r, GJƒ杒1vՕ:CKpMsBzTwX.>w~mIv%nӛ$loz=|7ɾu(yKMJ\*^gO"z754Wᠡ&%jܹǴe5zc8y:W b(> Q<,F#MW5 w׷8J#&fBuNy+FI>AjIjKDh)yEA,VR`;w~Ï8Lx3.tŧR ^댉*ar+1zdzb^?x‡NW]IWe/nfU mlOw߭?'X$~Ƕ;.aH\f}r*JXw8ے,r NcѮjP!o|)=n-!C';bi-iۻWz&FIcʨGOd d~Gw;.BrB;n[#1%KqhRu:}Y9蹛nO՟}h~EFͳէlYΑ~u8qJTɗ2 m%#vM ltcתqU{0Ҙ=ΙSlF#3eYfF*5WHm.F[nOx{=ڥ(}mGJ7N> (,3?/})8zsBAc/lB?١,jh(Jj7R^oוfObq,ӏ7ʙT8I<\دpHI^NH$x~O7KKΠJZ|:2vF8jF [ 2[`*%j{dMϽ2fZlypJ4Qב6W%7̿_jpx+E:mITV$-q3e7v,V{|MRdWL!!Qzb}՟IbP͗q# 2ԅͨO7V)jtyȬEWⲻ:$y]=Ʀە+NitOp`B3v,OO*pDԍ029((L J"gZ۸◽\Bhҗ[W-dKT &bL=x0!HW*!F,r-ļ|-RS!iF\WgnS;xn[pFvHwUZl5Ǟ@tOF.Q}%WNVRJLg܍(hĘ`]Z!m{-a4]nu)= FuoE*(хv.TCa![Q`} qk>4bϾ>S+Լq45#l<z!d:'}Ԅj>}VjP6R"pdG7* 5n/6Яv͡(gZ}q*^3tt@ky ]P=`)#sFMA(& LM6r[:ㄳ֠#uB'T.8]5BRtEd\+DPO}o jdYؓ'Sɲ"{ ^ ٳ4b?b_'P #vx箖^%gIJu Zvsǣ䡺*Bʹ2U57=x]S )a p%6ߤZZ~+oH0<*P*SJ Q]YToR7`R5DNh>7tZKi)àXRKQ}I5(kRATfpJ-G&ՌoZT%Q+~x.h(E<|jF Uh{-}ZJeRR:>ߤ:4#Yk)8-\aKQ}I5HykW(-S-isc*h Cs>M'ԍVYol(wiMњ@ \]dz yU9֜3/7'ݬBZ;$Bvᰓs3҉Q# HZ h;o;fzC67Lx$}u^_y( _>lecXq% _8rgW]z1R0&-]`$ugU} S=؀TT6|tur._`pI#PF;ĠLб *dhF5K4,3L0i?V2!c.4I 90^f[v1"ίz~@%WU+ 5`\h'\"L ,eY*V"CaXHʼ0*z*[)[n`Bުr\Zՙ3x@;N޺ϸ5_EĚBc"n|Us*"_{$<ٙw5/w 7[ӭK-J‘Rq77_?|6^~(0;dFh3Zm`[>Y3fh:LN?:TjEIĒuW+SꖧfZC 7w_&}4K_IAVIFêINxc(%=8 V4ʇE|x!?Y,7Ici~]TؾHj#+?L?7?:'x&B0dky??dBD2!Ca\&$`M uw;^:zpNf$Nh#.EQe0HJY3{3$oV~&f7wod^Ll*ʼnS}I5g?~*sc&R뺞W.ʺTMNݮ;Nպ] )rݢ|̈k UwfXu|Յ/_Fdߍ ]:OBnKa({`uR~.^Svٷqgͥ=ռrtzȕl،ge#'0-)-i6I7* .|8zxW.BH!)(SLRH_SHOv2jJ!)ccƳDԔW K Sbo t%m\5R%Pv7;Q||u_:NhNLryK@v X ]8]>(ˎ@u ]:B(]BwO{){:i]ɔg!e43[֝)A5N}P\B90ɏt JFt4BJ6*.jf5=6*٤iQ)!~ IJIӦo'"QQPޠj31))M$/F DP)IdiƤZYńZ1s,M0p#Ei }[* Eg^uM$'z{oQ_׷])<$]2ۻ%J.~~Zi[YMho_+Y0'<+a!DR ]f$c} /W|uxJ Nj` VwyB3.ϕiqdo\πMN 7*yr5U=%Z 
o=2yF2s7]*RPkwnxr.ۺ닛mJKҬGz>I.&_0fvuQvEio~B6rj]Y1>=rwtlf$zt&H(Xa%W>Yc٬A_bQ a&X-胲(!͉ ҤEɫCZڭABQA^ (OW?ǟF V-[lA[nb3+mqq i 5O+yX{WQXuS~o`2:NncbSg 4 b6 8:>|:0Zw]=_źjnj h#M‰S E5 PʜFZveFspjYiZ/\!IˏʔJ_zAym \YW= r%˛u|:Q y,v2?+EPA.&7#L@n )@"*}>$P KH此3)-|K|;X& At 4n:m^Q9X\K~ZˏxdTs4MI#ExP~!/miU*t"hqgRF$jѡ& XHBΊ8/¨]U]$ԪPЅtp1_'n/Bv{n|Hnenq [ُKӏ>1g#,?|~?B΍WI!f.Uhwu! Sǿ' )+|>MoNX8C =:*1fݞD%-!IZf%XyIpA%2sPp~(1R[E@H6\l0&*w>7b/ľs}";딪BSj)КS 稥<[7gcЂPU>O)۩{K/گFORzp4 Q{xJTNc%2""l]#LHih(.)BA1 [bg oŷQ:Te¡rQ)[ ͓џ?@*XK=+L4^dzcLRFCYy'yMk $ t< ]@Ά< coSqYQkNB-\'6: '.[`6[.4RNm%Q* :\ӌju1$3$1YI$j9݋Gl8/JtVDȊŁ3#XK Sc2!Ge~Wr7K*D[Js0w9I4,{0M8*(]ڧu7s"JC9!ъv r 8!O y%eiVn;m0j5Y&10'qX>K}3[be4B?[ΒWc6|_ j %Pם{0kv)!w#JkIaVΠA!w@)2#YPhF:YoĖLA⇇^nӽ6YWc:|^}l_ds忨se[Hb$GVErVw Pkp{^c>g?S;:CmM|27 >efNgZįWFB@4 x݂N-"w_? FQ-gA:N f&7:YQ3S S (CH `'DpN)aJ0$|ȏNsl,5:!lp 6bћKm&9 A6Q_9GLD}ҦYR8JEH'mRi5FE/R#&FIHoҦYs |~fTFU(&mnfMC~F=Ɣͭzݔ@j&87֗1Lf[}[rq7ٞM,f9Mܦn.MS 뫛ϛ.K@[elpuѶaΣ5_^WnY"^ۛ *eVh#ép3|cnMftٟwY.|LȔg{)}x߿{84䅫hNI@F%Y7F([,>6Oyxo-ukCC^F_Y7 ׋1ApKni[t;_߭ y*Z)S3 qsL)ì:jdT^8jի/jv>:^{}C3.l94ZRA [vg'HRc_7x]QPsN3MKȘ[Z`,Ζ㝾l'˕O޾1j藁O `pJډ!9j=3ZČiy"-T?)m =7/_]V0%5H)= .G_#E-@n]EL$ =3*؞ĎƳ1Y$%N,PrSB*FE춏 s’}P۰10(+6oaY^xEEKUSmϞ{e.5oXwS=W;-Qѯ mPn1v0Bp0Rv0bHn<9d 8uqRCBɥuN RLԻr*.c!8l~meK b%҉a&Хߛj[b ]e_ŻPYB)(ݭq̿ҥI0$-4Ź )]Qu{uG1\)ԦKgU:Cu_w% 4EgH0[/CШko.no DHCK:w؉Ƥ]ѽ"K0uzԍu^Wf‰$J,>,bRle8j9uΠ.9x[iCC^[o=cnN3Bpޗ#ܛuiАQ:E=}(|X |T'!mUgfݢkАuK)y续AWAw=&Xg`O_i1gQB.xg=Fl LJ49(r pنko5.>>XfЭiEN57ޑS{'^#lm]4ȩK{'F'%ෆ>v @=%Gν6:^(pz[Nxp*ukkb<1aS,s^" :q#Kt?vw~o"ǀeo{O݅/X;=ك&sGLHtI?|9E/idZZR#.#\K)goPK]|-?Ƨ")9j[K:τ`ϟe5 hAi9޳%31)͇PCSJl#HBz@ ph #:u0[m 8mL ns&=Agˤ*) LRG*, :4H8˜qsHhd(- )S{Vqjuݧ4 8!3Ξ(A&{Qtry#QeİNAĄ[1]&.3vߔ Q秋vX$V5/m,=ِSЌ`IL\hc:LSC*lj\"iI֢( D)iqSr d*[sZ*ryT wSo_NEDC{ pP)Eh& -(D1 1RBb A3UD"0ǜa)ciX % 2qJY1%RTrA 3qS!s*]P1ߕk=-:#Eom+Z+k뷍E=5ScfaLNaюNasǂB`(GjzOj:kW+@K {o-eAƘWqSqU  lz>Fh*]uSΛi3;?g걜Ž5 a {lf\K2mz1Z*(CC4C kUj߻R!,tdyGHH4`3wg -(P v5r> ||SE\fs׃Ctkc6<5VXj |a!qVص/,7\j.ݔν=Pb1i/(S˩uxdzJU4FXS;j`b1RgY.)ѺEVZ4䅫hN5ܷnX1XX |T'!mUgS<[USSH(o^>w-0+rf3hl J<%Sr^]hvq 
d.zsdIGqy wy%3{;~G;,y9cF=Sj=^n?ۺ-m2pYfAXKh&Mnj+Ю8أvq N # h,HPgf/}T& =mY0FΥ#O~A wWN}րgJ/<ZS!F Y_9. YrJ)6~ZS\(L.TKʺ{68Mq:z>+8ݻw' =ɍZd^O冎z'tZwjn6Okkc2ouDlee;|j1K[$\N횶l2~f]ZnweGNÜhv ŌGAJ!P3SFQ$|50&%ym&s7"lDt狘-_]F˿%$#2yIAͨ3{{8BG @­\l=LZlSͱң^o,R{Mi\G Rb"N$#:޿ΣhrUZ)}5ׅz][dr㾿~!&CSZd ƇhC>"1 ,S^(hZH0\4G èXH %zLA}~=~~K`TUyK]~ݿBƣHJ쟀3۫I1[[B?뾳yP͟adJ0&jܑɩ=LbCl?z 1jb|neVs?K$FػSJqs:M᭻)3 {*R. 1\7:]KT}*V!?a5٢ ٟܺHI^HYq|闛3 m嫹jfT 7'<1SE1Jojw=x'>zBwƝ;eػ.o(ݱ{qm κ.wۑ8v+9i{! ϋ򣻣=SEIj1 w籞~R:9wch܇BQ,)b x.&; /٧ +Ή@&EM-[ž.(;L~S#',m;w L5a\<5if?&˺dZב n"qDxz]<(aopz?O͚߿_j-.lγ. CUK0vR5tDr!a56`"nU]i? I$Z)n&K`$VRIzXWGʳCe.ۨj=Q)-)Hy~k=v]= V$iǼYs1]J~p;K@NkIQߣ;:5R֢vw KT0J]I'}zl}J g^֕nǩ#h>*7#wP`uҺS=ޕcՕt(lnF+9OܵF-'xp)JG"h:Pj`73` bz{n"ŃV]Fꏬj*!C/ڽSY`+,D}zL$$+OV@[ch̑ -PIzkRyj҇FiW#gXeQA&`V#}C#SsCZϬ:-[1dd*y]Jj==VXU  x ,Qi1Qːt CGdSabbW1YcےahXqjԯ+uz mhlY~^3A: yIm xF m+Vk. H8Px hOxO ,TX+f*b&N7-42WگԬ;"À޾zs9fr|֪KR"HS+vlf<~jP\`ەm#lnQLIĘax$19ḇ9+wRl?ry븞8YtBֶʱP1cM4A繞K&Vxy'{e-/,y#eaƬ$23:\(-V6ysQNͅA__tkyu NhF4U%ӌx(oG6=9vr(5;mdky!/2W333#`y`<ИsY^Ö)#c6l#X 1..QoxH3OkpI  >vPr͸ Ku$mp7cdǏ*އŨڽc0x밼[<V!z%vgbon?sUN(:,d[ K΅[ýx70^>^ |?f 8gHU.ZJ B՜/hV,Hx~-OBiHAҶIV)O:Y تO*a|>CCek⡔J)CP}]QϲFJT|;oYD42!}L:&q-U;EBZo>Μ2s)XG&ATH`e_lg桴R=,&)Q %XBIKYY}]n c+]jiV31;+A:Yif7ћdHr%~)E I|q:5kikK"85v`q7}j xt||xG1Nd}z}3K^mO5S;9M}q7i-8c?99$.,EgPӐRk/C+(큫QcwkUpCv\@Xp8ǜc. گ eǓ3FJ`Ez!*bO}ݢ~'iF7ƑG#]6Ñ;Xvۍ!mgo X#YEr`g:j p9޵u#"[l)N(X]4h~bYR$NC|uY@HGQlv5=ɮ?j6fsl6>pRZݽwb :0}bMz}$ii+=obFFQ| D 5歬J哣_7O 6v2\e .+/X*8ɉzt”Du+}N6oiz5@N,9s P+ ͍?>ڵM.l+}2+vL 'd 0}mX1r:A,#FI5wmcw#{ o)ynᴰwǥٰP/eE3M>SBHV2kQEǖ:<6=h@? 
ȶ.;A'r+Jt+r*ƖI?~\lpu[lt 1LtbВ#"DJqF}9Aja.%dYt{M;v|*j?i˲T{` gڪz3-ʃx`(W~p-ďmd_H} JPEPj袋Ǹ3B@R|SOž_|V[2^^<cN^ޜ(v-,V|; Ow1ɿ~zth=xT7 9E (\&mT!0ބq %m(6=$4S#2EGF}0R<ijL 6pQmXH )R$S2w<}c":Ju9ngaD9jy`'/#Qs$ac#4n(?{tҧ(>@^J:TwP>1yhіv 3KbESZf-Hu2T3&52SW06,ס4H,{42jx].}H#X9,Q)h ܶVjO5 d/9,Bpz/nۺWj1 ąUH诋:)qn//īT`Ws\j՜qoGj>v3Io~_&R*zlo#)lɐwWAx7Shܹ<h[ 2VʣMg-E*5'Pmuh oW!U4QgmZgwSR~m<ڍc fyǥǴQܱ<וR+o.J ;=j$F<$!4JB t;Kyx+5 "aԛIH2čjbWrCN0h^W / }_2y$.w hRt^nnkQ}7(ؼ'GK+hyt5\d~n(sGжE犊gE;Za2(KyWqkSI5,9T^*HP2h9R JCvQ1!DQ@W7ETo}e$emیrZUyG!2y#`S"YJkNlHzT)휇,!8؊kRDIu}Ҟ:ۢk+xK|ԒlNDѤe4@[ **r> .+(&1s)-KQfOp'u,fifM8ĞT <}ۃDv/b2P a,ѷq+MH>3ʗ)՚=4&n`+[h۩D !$ޔBTcQ 5 "xzՐzեR>%2 `Lynt?uCuY2[ɗԧ9ӕ*m4eDqAL >J/\[  Mea%kZ?;z˩?=Rcpq?hrzQ^1LdtYnN 2ínAe-zt|YM%p V=աK֊]\ϼn} t895:#%1nG h -*,lVXo7"R-/Ph^_$Ypj'$41DbuPD(@{{JR^c.*"]tѝ(ђSz}YYlka~/z}/r!}A{rp-*",M|?Na[)Y"ЂY]unpu.>;~? ܇[`k6߇b, ocJ1 tGhɶ!.O?R} 9؆t۷qp5ՓuֱZm@ik+<3`PF (s9a#gͩ+{٢HRK3}V>ܧK)Ǩ (g7)AwbhH;t)F.fK46KiUЅZv1 J< RŶgTJT {:\]7ո3.;Nqw 1 ,[sPG7d75(6w2Wg=2+bq:in m1ޠޤ}~ѬDomFGCz^@ ȈF=Zޖjp_ߞeVYO̙wQGcG'z1n<=k6DwdJf^B/8 scV,0 VJVjdk'ۭԾ/p1`YM|G.ڢ,0B5[QyҨQKWΉxe! [^44|Cyw#ZՒ^~fj6{#2Ve-9Q{+R%bEFjp| :2ؒ0H T%Wՠ a9f~2Kpq8cV"xV6MCm$_6$Ѯ^ul 2Z ul{܍ ڇ$Prc}|^l)S(/ ͸j\T^ G?. 
,eϜk 1SEfȣRh{,ɋL)''TPFgZbgpGعhŝ'r }/DKl2seB{aɋ#e;;V~?5mF#&!Xlrѯje\*ͧ&0 JgIVch|2SFN~͝A$)$Ej1dqfSxJΉ`9ES#V&ENɜʆ Jrd_- Yf/Ə+TDd `2fh x5w}Sќ`-2dlI) l%K>ՕgΦrQ٣J{%ݨ^>,7ɟFD[<92 U\- в$ GpFJvOoP6ǔ6<$d'+EC)D *jEOp]ж+S`<ؾY=7ͮJ='Ԅ$I@D4]2rt3!KqTI?LdIyF؀"N3Έ,ZN3)/{M05  9yA'ZN84-ItBݭ,'Ia%YyVk1p920UrGQy҂(nxfLv6]os&)Y5]Ԡ\:(6A{Uq{k$߻@!e !sm3E"A5)NjZ}4¬Fk32 $ [URou =5m+knHw~ G <:%:67M  Gˬ̪<ܩ-1Ncop<f7mnr(ޟY*CsNDE H4.$uP!aVAtrn2R|F5ěǁHe8V/]ΉZB|tFFB f0+(4q8XKkZoN*r2u9gA#o| *P/Piz CN]12ijAeIPhңjL*>ɹPk[/=2Ȣd Pa: ^qYkU&O"fU{7(H6Rm4'f\Y4uo8EU~ĸ-]QI56gz@ZuՈ*E3dVn2ABKi{g34窛^)DHJ|~29=N=;F]šAB`,TQ3lU'11-j_I)#[IZ֌Tjݯ$EN Mכ`%6Ns9>$ %m/r=s @1ɽ,Y[5N _6&x\A[bD.oʹ-,s*y;o@Pe L6 ,/p] :!چJ"u.k/q_F{D>>r2`DR"p}p0A'mI\mqV!ZuƼ*@'ir@ rTt'aJ=M".1AD-9 laW9\YvUC4$64}j(pψP8+:6*f>˖;5Fmb3,_89F1E 4Ǵ/kB\`RKȥ|S lC ,ZF Rz+lw2;4@;DSC+4%v`9KDWzו彦:L~ "0͝%!I9 &C,'rxH ŕ)S ;~{fG} &zv4BPπM.;35ёѽP h]jW}m8L*I~i>!14hrxP [7Kkm=RH|kjfDwnvq &9;KWB;_+੧`sW2JZ>ܢqBɲD$ˊJuw&7n/wц]ڠϘ.)-&"M:rU#OJZ$Go(|dRkpp&PNjthjV)S}w>c3 TF۫kn- F}4rbj3,8b_JPtoוY>eAa=Gj#{K(BҙRWlk㴠M˯.ىUrBSԼEkIp}r1E.NbA;M,I,Ӽ$^zוTrՏoLHj"" p1FV,\˄Q%^X.|Qَ,WKBk0JiT蠼 !TЗV)D@Kq " J?*gZ/؈bN,z$3.5 9}nŎAtnoOk fmYu _Ͷ@6@H_mf5s)^ |#ALih4Y2[9r"ͻI5 w;ͻ1$/wkḧwS^P&WXVU6o4V{Ƨ>]Nh\rNG#cl[o|2wpFSe">@ ]6lωwJn97R^C֟!K\-SO,=l.-vY~Mݫ'* f53Py&GWIp{&Mdҫ'g EV|?^7]|{ޞgM.Ccr)py\_!57(xdƨ8/j*^h )vqzu{ե$G^p =I oޞ?^Ŭg$Ë|TpPŊg|[BT臣ڗkyUsw yU86Cxdzɗ" G^-M9r0˩¶kOU aހ"*?]:}/O|Ow,c07eokow~Wxe j*͎oCs1jǷ|aݎ:ĭZH>j z`4%Ɣip'J,Hitrczh3k5G}8j%*y\{_ByAm! 
SAߔHT6 T[%C$@Evf89UO>@~j|Qs?z 8R5J[ڣVĀ؎&JV< y.}n˒KgY29:,L(OoLh+a۪KZ;<}K|iiŽ.}~}ޏe`񽿸: ȀI6*'b|p4Qon}t䙠|pGhq'[> OuJvH#ZOumZ7yrS.Ly kz'VNz;.dJ9h7w\>vkAi&JMka5[EtAjR'AIm@tOeqz}mC: {nX٣Cv|f^ z3'S7Qsk}j,^ k%\цaӠXg4VR,֌'m8:mt|t'_*|R~Ò\Y@h&pg=Ѳsh|:9wo,|5ݚ+׭JmԧG}8j+Y{)I)5T)RuV$&у)87R:Ns` T˫jd>PNxl:XRDBC!xXєd^1Tv!Ggng7oAݎgOE]RPb Tj=S!b [-D5b-N<Ƹ_jN,l#bFOTz>|'$GWp/U!7,(i}bu$9'ݑԓZr}*M4fhZV4pm*Uo[j}s}A] eBV6-rioq4L_^ϋm~?6=_?D7r .BH`ƦTĸU;'XЕ0V2A<}dҏ.oQW\,bԝ?)Z9z8E̩9 m@=j_%yP>ج˃Қ7؄ny_FGt|+UG?\v~4烦zaNPJT邏@`[{qZU.W RR1?E9C geV8ɱmWwO8뷅!ki\EoEsEsEsEsP hxQDeڃ8~0SV ɳ[ F'QA-&" Cv} ,ե?FwcLqxDm% z3&eoV0jèQY'maghː d{ Sz0!DЧ;R_ǟ^Šx̢yW-6ܤ6 2}۶m'?z.K}PB%{#RҌ3̐?Pyysk>qK](R'd3){Ԁ#LhF;Ʉe= [s2qAf 3z=2ɄN"-\YPPP? S. ,$V49hv'-+ƞxjLɪ)w1W4@%AJ{M0^gZLj.5HOV 0 •S\0;N@lu F6Ji+lGbBoHVNRT 5Z(e {rYDz(Z8@(AdVo# K'h£KTV1 EMx,%S?B{vb,~HϔIĄ0*\|o8[NOg0tf~H-Ǿn/]Vﳞ&O۟٬u~y8GJt'g Bb$:{-KL@^~Y|*OB&{*"ܘQ)K(ЄԔ )hVM%e.ʬ'bh<gv=4@&Y:-FzF1"XRel/%L)f&w\-I3.r<`h~ SGӦF1r26yNTfvy .3;9Nq?io?nB$^r }ߎ'l1[L$:lO hHE ,NIJ}C[<Z@2LD% L3<̘-ُԫ9ݣ[c==O-l$+Pv%FYҫ`jsIh H.xʖ#f2h~Vz) {O)* 0O{Ӽމ(];muƸv`wΓaUNuf6 'hg)vX7'~5ƴa0VM.9]3yRQBާ'<~S햇c->'ADzd%,:&ȶѲz>p΂qH#gP~JѮ-5@L0&Rcd!@^Z#l|@^,/*1zK͉p:{R@l s63# 9oʡG t7=>8M>j|eٳG>Tvoܺaxu~ԮN{S3Sg| X9ܜ$WQ1|3^v@[ |B'!mTyfC4*4}Ź݄C7avO~#$Mh-23W17wf_f N?չ닯7?/I`ou~ӻ77/햂]=3JA+ݬQ]N^o}&KDS±0H|RK 0&<[B]&ô 0 6G3S"pf8FJ |۷h!2=z'mӣ{zC<*P!ϋOf$}ܨUX\q5X-5VS4‡,F`F` X55[ kZ]jGIlWpk&ghʼnT8mqQhsHFCnve% Sg >o8rƀ,H3"w־SBzݕZbwtY" Mc`,.Ԡ:G鲣Ɍ*M]$݉뎤S/?8~U&>wfeucf6Qk}gۺQ\ޠ(~w?PD*p% l_-nRK ~.Y}$S}6cb)I))`$r9hI)JDN>RRN@qqR\p2)Bx)s!f*0/[tXޭ5HGXWɬIu^p&Y)ՊՕ\9+Eh;Nٱb_bdу o_#Xܦ{9Յ91N+Tܯ+Yh +YL^9g0Ohӓ {!? y>dl mb҅'?SDDW}jGeyQ*JTR`bJYYV(eeS׺( @hݲfK/={A0֔P<qQP^¡D;S%<׍lUb]Ѧ FJ^弮zφLgY`CBPBѼв ;&. ƌ(iqЪ)h׆ܳyc&5RLˢ?˂L@\|.KC]|p3iy<%zOjA;7-, JQ -h͊*fYC"W s exQT 9ڄ@#8i>2UJa] /϶zŝeԥˎ DeWeIc5fBR(f=˶k{jD#nc52fI8_"K\vXF+xz8"3{Uʸ\jyj@WmCk~cEġVOw6_ b0i7"|˻G@7o9W_.mg> W9+M]y*Ԃ!DB1 LAoÝ2B(OaUK>- RRsپ=(J5 xOH! 
xƔT*R 6Օg$8ъ;.; ?13!Mp&8G .]ַ`U ]7Mmᒼ ,[-7}fOmGb$o PO:ui0  n7^n?EM- h-tUF>_t>5?O{0h@͵ZG^ ӍR+ gItˮԔIMIfɏQ",pM`Pໄ¢T'WtOgή̔cNHkVoӽH"=a)"䕇hTL!\ )I_#mŨ .aE5ڗ5A)d8a-^lnS":X$Cg+p-E)J| >gY9zݕo'-фW4YD)]+}EG)og[ cR?֭}G)%^%}f%Du(=-+5(]tRu),gKOKJ .9J@&YD)(s.1J۳'rQ=G颣q(eŮi]b;G{֥TxHgܳ.mRyd("ebOWM]V~W[*ҒMw뛯 4Αo9VHq 24jOX,Jp7dS;9 .A2r6%螛?NR9Oǟ:ai]"&ŌM]~g،BE|5^[` ß~vj&t-,XK7oT 1HJiE~3hQ}eq6,fLG3NQ0d#Ү޶\m.BJݘlRJXɢB,k(<_u^4䪚"'׺Fg˭nH^ǘEHNc3d&H3?&8vvעQy*L9t2+U֨M1EB%<3!*1$ $HW6,_1M7(+yAHNG[ 11885ms (19:18:40.367) Feb 19 19:18:40 crc kubenswrapper[4722]: Trace[2129662400]: [11.885945232s] [11.885945232s] END Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.367628 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.367747 4722 trace.go:236] Trace[1037160239]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 19:18:28.648) (total time: 11718ms): Feb 19 19:18:40 crc kubenswrapper[4722]: Trace[1037160239]: ---"Objects listed" error: 11718ms (19:18:40.367) Feb 19 19:18:40 crc kubenswrapper[4722]: Trace[1037160239]: [11.718727728s] [11.718727728s] END Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.367764 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.368798 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.380973 4722 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.404125 4722 csr.go:261] certificate signing request csr-jw779 is approved, waiting to be issued Feb 19 19:18:40 crc 
kubenswrapper[4722]: I0219 19:18:40.404912 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54328->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.404918 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54336->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.404972 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54328->192.168.126.11:17697: read: connection reset by peer" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.405011 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54336->192.168.126.11:17697: read: connection reset by peer" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.414790 4722 csr.go:257] certificate signing request csr-jw779 is issued Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466707 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466752 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466775 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466797 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466817 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466861 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466883 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466951 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" 
(UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.466975 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467033 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467073 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467091 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467093 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467114 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467141 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467205 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467227 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467254 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467274 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467293 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467312 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467330 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467349 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467369 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467393 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467414 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467439 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467460 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467479 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467500 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467592 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467613 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467636 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467685 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467707 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467727 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467747 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467767 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467789 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467810 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467831 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467853 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467873 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467898 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467920 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467941 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467963 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467983 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468004 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467216 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468027 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467348 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468050 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467649 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467755 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467730 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467736 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467764 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467925 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468210 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468229 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468223 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467918 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467969 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.467990 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468009 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468285 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468844 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468887 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468911 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468946 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468982 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469021 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469045 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469064 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469081 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469103 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469124 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469169 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469204 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469226 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468937 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474805 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.468958 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469834 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469858 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.469891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.470046 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.470620 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.470829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.471371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.471457 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.471785 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.471990 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.472554 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.472662 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.472792 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.473966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474091 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474100 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475000 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474679 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.474838 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475143 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475265 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475292 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475324 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475354 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475383 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475586 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475610 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475632 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475658 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475684 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475762 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475788 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475144 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475814 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475191 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475273 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475845 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475688 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475799 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475893 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475922 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475967 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.475995 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.476474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.476774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.476791 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.477296 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.477811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.477949 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478558 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478590 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478660 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478687 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478720 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478744 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 19:18:40 crc 
kubenswrapper[4722]: I0219 19:18:40.478801 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478823 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478843 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478965 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478988 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.477803 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479092 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478069 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479182 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478367 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479213 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478414 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479256 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479415 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479444 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479500 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479566 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479592 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478417 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478559 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480398 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480368 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478792 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480437 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478983 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479442 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480514 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479849 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480546 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479954 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480192 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.478656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.479284 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.480957 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.481590 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.481629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.481647 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.481738 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482020 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482122 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482198 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482694 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482776 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482790 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.482869 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.483207 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.483908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.484002 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.483755 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.484446 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.484793 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.484939 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.484987 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485163 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485238 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485293 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485318 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485346 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485369 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485413 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485435 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485460 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485482 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485505 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 
19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485527 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485597 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485620 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485644 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485666 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485710 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " 
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485801 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485845 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485866 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485936 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485958 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.485980 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486002 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486025 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486047 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486070 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486120 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486142 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486181 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486205 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486227 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486249 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486326 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486346 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 
19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486363 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486380 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486428 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486445 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486461 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486477 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486495 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486512 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486543 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486577 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486592 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486609 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486625 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486640 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486655 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486731 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486750 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486771 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486789 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486846 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486864 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486897 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486913 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.486945 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487003 4722 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487014 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487023 4722 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487033 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487041 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487050 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487059 4722 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487068 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487077 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487086 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487095 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487103 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487113 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487121 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487130 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487139 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487165 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487178 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487187 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487197 4722 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487206 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487215 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487223 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487231 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487240 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487249 4722 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487257 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487268 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487277 4722 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487288 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487297 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487306 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487315 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487324 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487333 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487342 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487351 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487360 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487368 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487377 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487385 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487394 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487403 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487411 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487420 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487429 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487437 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487445 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487453 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487463 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487471 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487480 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487489 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487498 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487507 4722 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487516 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487525 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487536 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487548 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487560 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487572 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487582 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487591 4722 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487600 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487609 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487621 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487633 4722 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487645 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487657 4722 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487667 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487652 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487676 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487731 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487746 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487765 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487777 4722 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487795 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487807 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487817 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487831 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487846 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487855 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487865 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487875 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487885 4722 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.487895 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488350 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488363 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488373 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488382 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488392 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488401 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488410 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488422 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488432 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488442 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488452 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488462 4722 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488497 4722 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488509 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488518 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488163 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488176 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488223 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488499 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488621 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.488937 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.489232 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.489428 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.489537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.489794 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.489942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.490177 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.490661 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.490938 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491001 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491192 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491226 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491325 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491454 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491591 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491704 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491777 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491792 4722 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491874 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.491938 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.492175 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.492249 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.492550 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.493314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.493436 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.493564 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.493613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.493842 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:18:40.99382177 +0000 UTC m=+20.606172104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.494265 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.494437 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.494493 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:40.9944773 +0000 UTC m=+20.606827624 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.494520 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.494571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.495215 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.495280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.495284 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.495304 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:40.995282345 +0000 UTC m=+20.607632669 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.495595 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.495828 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.495862 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496064 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496078 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496205 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496243 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496500 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.496871 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.497552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.497776 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.498291 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.498328 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.498580 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.498791 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.499043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.499703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.502690 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.502865 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.502889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503205 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503215 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503506 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503498 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503607 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.503810 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.504393 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.504422 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.504440 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.504508 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:41.004488493 +0000 UTC m=+20.616838907 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.504204 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.504246 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.504354 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.504699 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.504763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.505234 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.505376 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.505488 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.506015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.509240 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.510767 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.510440 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.510858 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.509411 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.509465 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.509490 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.509838 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.510945 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:41.010921974 +0000 UTC m=+20.623272308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.511852 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.512038 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.512835 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.512913 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.512976 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.518280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.518588 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.518694 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.519283 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.519179 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.519616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.519403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.529497 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.534220 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.534914 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.544659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.561173 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.561683 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.567676 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.590527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.590784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.590946 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591040 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.590813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591123 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591207 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591245 4722 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591257 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591267 4722 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.590781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591309 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591318 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591328 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591336 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591345 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591354 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591363 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591371 4722 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591380 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591389 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591398 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591407 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591415 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591423 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591431 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591439 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591448 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591456 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591465 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591474 4722 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591482 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591490 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591499 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591508 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591517 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591526 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591536 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591546 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591558 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591567 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591575 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591584 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591592 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591601 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591610 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591618 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591626 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591635 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591643 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591652 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591661 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591670 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591678 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc 
kubenswrapper[4722]: I0219 19:18:40.591687 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591695 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591704 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591712 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591721 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591729 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591737 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591746 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" 
(UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591754 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591761 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591770 4722 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591778 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591785 4722 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591793 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591801 4722 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591809 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591817 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591824 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591832 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591840 4722 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591848 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591856 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591863 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591871 4722 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591879 4722 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591886 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591894 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591902 4722 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591909 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591917 4722 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591925 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591932 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591941 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591957 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591965 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591973 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591981 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591990 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.591998 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.592007 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.592015 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.592023 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.592030 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.592061 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on 
node \"crc\" DevicePath \"\"" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.637377 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.650799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.661137 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.663383 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-8c0c1d137e661aea8967457f6fe359bfc840933c490b2c276607877dfadbe624 WatchSource:0}: Error finding container 8c0c1d137e661aea8967457f6fe359bfc840933c490b2c276607877dfadbe624: Status 404 returned error can't find the container with id 8c0c1d137e661aea8967457f6fe359bfc840933c490b2c276607877dfadbe624 Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.674949 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-37b9ff0317d4cde4f0c738bd82866d4c7cfbf6c9f23d15c161d29ec07e70a24a WatchSource:0}: Error finding container 37b9ff0317d4cde4f0c738bd82866d4c7cfbf6c9f23d15c161d29ec07e70a24a: Status 404 returned error can't find the container with id 37b9ff0317d4cde4f0c738bd82866d4c7cfbf6c9f23d15c161d29ec07e70a24a Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.797244 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-xq6bx"] Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.797939 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.799953 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.800450 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.800739 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.801732 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.817183 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.838476 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.849897 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.864221 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.873416 4722 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.873534 4722 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.873565 4722 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.873585 4722 reflector.go:484] 
object-"openshift-image-registry"/"node-ca-dockercfg-4777p": watch of *v1.Secret ended with: very short watch: object-"openshift-image-registry"/"node-ca-dockercfg-4777p": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.873903 4722 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.873978 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-node-identity/pods/network-node-identity-vrzqb/status\": read tcp 38.102.83.195:44978->38.102.83.195:6443: use of closed network connection" Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874167 4722 reflector.go:484] object-"openshift-image-registry"/"image-registry-certificates": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"image-registry-certificates": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874195 4722 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874347 4722 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no 
items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874372 4722 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874389 4722 reflector.go:484] object-"openshift-image-registry"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874404 4722 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874421 4722 reflector.go:484] object-"openshift-image-registry"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.874446 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/events\": read tcp 38.102.83.195:44978->38.102.83.195:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-controller-manager-crc.1895bbf81d19a547 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 19:18:22.754661703 +0000 UTC m=+2.367012027,LastTimestamp:2026-02-19 19:18:22.754661703 +0000 UTC m=+2.367012027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874530 4722 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874549 4722 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874564 4722 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874579 4722 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very 
short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: W0219 19:18:40.874595 4722 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.886125 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.891985 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.893216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fad04006-ed10-4444-ae85-9c0a31a95466-serviceca\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " 
pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.893262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsjws\" (UniqueName: \"kubernetes.io/projected/fad04006-ed10-4444-ae85-9c0a31a95466-kube-api-access-vsjws\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.893296 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fad04006-ed10-4444-ae85-9c0a31a95466-host\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.993959 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.994073 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fad04006-ed10-4444-ae85-9c0a31a95466-serviceca\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: E0219 19:18:40.994101 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:18:41.994080399 +0000 UTC m=+21.606430723 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.994125 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsjws\" (UniqueName: \"kubernetes.io/projected/fad04006-ed10-4444-ae85-9c0a31a95466-kube-api-access-vsjws\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.994167 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fad04006-ed10-4444-ae85-9c0a31a95466-host\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.994249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fad04006-ed10-4444-ae85-9c0a31a95466-host\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:40 crc kubenswrapper[4722]: I0219 19:18:40.995263 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fad04006-ed10-4444-ae85-9c0a31a95466-serviceca\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.013353 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vsjws\" (UniqueName: \"kubernetes.io/projected/fad04006-ed10-4444-ae85-9c0a31a95466-kube-api-access-vsjws\") pod \"node-ca-xq6bx\" (UID: \"fad04006-ed10-4444-ae85-9c0a31a95466\") " pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.028132 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 16:41:26.104576767 +0000 UTC Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.079862 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.080579 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.082318 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.083133 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.084547 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.085302 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.086086 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.087952 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.088841 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.090353 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.091249 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.092902 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.093623 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094042 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094344 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094610 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.094649 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094711 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094770 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:42.094749274 +0000 UTC m=+21.707099598 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094817 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094846 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094849 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094863 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094854 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094918 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:42.094902979 +0000 UTC m=+21.707253313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094927 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094940 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:42.09493023 +0000 UTC m=+21.707280564 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.094945 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:41 crc kubenswrapper[4722]: E0219 19:18:41.095025 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:42.094987772 +0000 UTC m=+21.707338106 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.095664 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.096406 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.097683 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.098210 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.098993 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.100520 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.101199 4722 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.102541 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.103121 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.104548 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.104957 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.105625 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.106707 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.107189 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.108210 4722 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.108660 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.109646 4722 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.109752 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.111329 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.112218 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.112626 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.114182 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: 
I0219 19:18:41.114808 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.115728 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.116357 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.117387 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.117855 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.118519 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xq6bx" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.118946 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.119635 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.120643 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.121146 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.122049 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.122709 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.123878 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.124393 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.125282 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.125712 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.126596 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.127181 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.127631 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.131532 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.148377 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.179172 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.191297 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.198849 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.205206 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.205826 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.207312 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012" exitCode=255 Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.207367 
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.208805 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"37b9ff0317d4cde4f0c738bd82866d4c7cfbf6c9f23d15c161d29ec07e70a24a"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.210489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.210521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4aa51c6299b68bb6f01f25018528b68e6381f4f60652d3b2252331cb08a10c52"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.211723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xq6bx" event={"ID":"fad04006-ed10-4444-ae85-9c0a31a95466","Type":"ContainerStarted","Data":"7769573710b422fe1497d25bafd7920f3215eb3d9eb32511a2440dc0bd1c2c91"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.213878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.213908 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.213921 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8c0c1d137e661aea8967457f6fe359bfc840933c490b2c276607877dfadbe624"} Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.216621 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.225768 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.233522 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lwpgw"] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.233796 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.233841 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-w8zrl"] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.234312 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.235904 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.235965 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.236249 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.236263 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.236670 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.238919 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.238997 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.239123 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.242129 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.257039 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.267115 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.278081 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.286241 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.299671 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.308021 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.315254 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.329413 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.338129 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.346430 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.356620 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.365336 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.374118 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.383325 4722 scope.go:117] "RemoveContainer" containerID="e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.384113 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.396997 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9734f69-4441-4618-849c-54e0aca328e4-hosts-file\") pod \"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.397073 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbv9c\" (UniqueName: \"kubernetes.io/projected/e9734f69-4441-4618-849c-54e0aca328e4-kube-api-access-bbv9c\") pod 
\"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.397110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b265ff4c-d096-4b39-8032-fe0b84354832-mcd-auth-proxy-config\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.397136 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkfk8\" (UniqueName: \"kubernetes.io/projected/b265ff4c-d096-4b39-8032-fe0b84354832-kube-api-access-fkfk8\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.397182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b265ff4c-d096-4b39-8032-fe0b84354832-rootfs\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.397196 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b265ff4c-d096-4b39-8032-fe0b84354832-proxy-tls\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.415973 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 
2027-02-19 19:13:40 +0000 UTC, rotation deadline is 2027-01-07 13:29:13.529073629 +0000 UTC Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.416035 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7722h10m32.113040436s for next certificate rotation Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.451609 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.461871 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.475377 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.493200 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498097 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkfk8\" (UniqueName: \"kubernetes.io/projected/b265ff4c-d096-4b39-8032-fe0b84354832-kube-api-access-fkfk8\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498174 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b265ff4c-d096-4b39-8032-fe0b84354832-rootfs\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498199 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b265ff4c-d096-4b39-8032-fe0b84354832-proxy-tls\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9734f69-4441-4618-849c-54e0aca328e4-hosts-file\") pod \"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbv9c\" (UniqueName: \"kubernetes.io/projected/e9734f69-4441-4618-849c-54e0aca328e4-kube-api-access-bbv9c\") pod \"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b265ff4c-d096-4b39-8032-fe0b84354832-rootfs\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9734f69-4441-4618-849c-54e0aca328e4-hosts-file\") pod \"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.498503 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b265ff4c-d096-4b39-8032-fe0b84354832-mcd-auth-proxy-config\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.499251 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b265ff4c-d096-4b39-8032-fe0b84354832-mcd-auth-proxy-config\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.502672 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b265ff4c-d096-4b39-8032-fe0b84354832-proxy-tls\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.512957 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.515248 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbv9c\" (UniqueName: 
\"kubernetes.io/projected/e9734f69-4441-4618-849c-54e0aca328e4-kube-api-access-bbv9c\") pod \"node-resolver-lwpgw\" (UID: \"e9734f69-4441-4618-849c-54e0aca328e4\") " pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.521004 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkfk8\" (UniqueName: \"kubernetes.io/projected/b265ff4c-d096-4b39-8032-fe0b84354832-kube-api-access-fkfk8\") pod \"machine-config-daemon-w8zrl\" (UID: \"b265ff4c-d096-4b39-8032-fe0b84354832\") " pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.530595 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.548352 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lwpgw" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.548731 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.553529 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:18:41 crc kubenswrapper[4722]: W0219 19:18:41.568013 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9734f69_4441_4618_849c_54e0aca328e4.slice/crio-5b381afc5f721e8245adf251645ff7c0a743cb4e8446354064cc3fac12d762de WatchSource:0}: Error finding container 5b381afc5f721e8245adf251645ff7c0a743cb4e8446354064cc3fac12d762de: Status 404 returned error can't find the container with id 5b381afc5f721e8245adf251645ff7c0a743cb4e8446354064cc3fac12d762de Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.569334 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: W0219 19:18:41.575049 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb265ff4c_d096_4b39_8032_fe0b84354832.slice/crio-39d83f73c898f095c09768da8c6f5deb56696c2e9e0d54c907be0c3008d68397 WatchSource:0}: Error finding container 39d83f73c898f095c09768da8c6f5deb56696c2e9e0d54c907be0c3008d68397: Status 404 returned error can't find the container with id 39d83f73c898f095c09768da8c6f5deb56696c2e9e0d54c907be0c3008d68397 Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.586612 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.607981 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.608974 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-7g5gg"] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.609663 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.612421 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.612485 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.612542 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.612693 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.612810 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.622199 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jnvgg"] Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.622310 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.622561 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.623740 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.624660 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.644597 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.662746 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.671739 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.685755 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.698842 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 
19:18:41.700100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-kubelet\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700124 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75c45\" (UniqueName: \"kubernetes.io/projected/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-kube-api-access-75c45\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700187 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-socket-dir-parent\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700203 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-k8s-cni-cncf-io\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700218 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " 
pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700233 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700253 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cni-binary-copy\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-etc-kubernetes\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700290 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-system-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700315 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n829t\" (UniqueName: \"kubernetes.io/projected/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-kube-api-access-n829t\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") 
" pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-os-release\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700362 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-os-release\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700377 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-multus\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700392 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-daemon-config\") pod \"multus-jnvgg\" (UID: 
\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700415 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cnibin\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700474 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cnibin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-netns\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-multus-certs\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-hostroot\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " 
pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700584 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-system-cni-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700638 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-bin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.700652 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-conf-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.709125 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.712895 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.721977 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.734739 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.744811 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.745924 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 
19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.754242 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.765828 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.776505 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.787422 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.797910 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801081 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " 
pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-bin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801130 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-conf-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801163 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-kubelet\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801178 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75c45\" (UniqueName: \"kubernetes.io/projected/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-kube-api-access-75c45\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801193 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-socket-dir-parent\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc 
kubenswrapper[4722]: I0219 19:18:41.801210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-k8s-cni-cncf-io\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801217 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-bin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801250 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-kubelet\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801275 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801299 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-conf-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cni-binary-copy\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801357 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-etc-kubernetes\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801395 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-system-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n829t\" (UniqueName: \"kubernetes.io/projected/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-kube-api-access-n829t\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801457 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-socket-dir-parent\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801459 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-os-release\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801567 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-multus\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-daemon-config\") pod 
\"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801461 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-etc-kubernetes\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801626 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cnibin\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-os-release\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801664 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-var-lib-cni-multus\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801688 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cnibin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc 
kubenswrapper[4722]: I0219 19:18:41.801745 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cnibin\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-netns\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801749 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-netns\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cnibin\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801815 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-multus-certs\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801799 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-multus-certs\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-os-release\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801838 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-os-release\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-host-run-k8s-cni-cncf-io\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-hostroot\") pod \"multus-jnvgg\" (UID: 
\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-system-cni-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-hostroot\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.801969 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-system-cni-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.802078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-system-cni-dir\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.802125 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " 
pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.802210 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.802229 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-multus-daemon-config\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.802367 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-cni-binary-copy\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.810503 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.818474 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75c45\" (UniqueName: 
\"kubernetes.io/projected/285e13d6-a3ce-4bc2-9be4-bb6db3593a0d-kube-api-access-75c45\") pod \"multus-additional-cni-plugins-7g5gg\" (UID: \"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\") " pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.822397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n829t\" (UniqueName: \"kubernetes.io/projected/7a80fcd7-8ac4-4e82-8f14-93d225898bb5-kube-api-access-n829t\") pod \"multus-jnvgg\" (UID: \"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\") " pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.823461 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.836056 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.849468 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.861233 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.873365 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.883565 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.892796 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.910363 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.939747 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.949931 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jnvgg" Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.959794 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 19:18:41 crc kubenswrapper[4722]: W0219 19:18:41.976443 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a80fcd7_8ac4_4e82_8f14_93d225898bb5.slice/crio-067d7f3c981aee6e7b6dafc5b12acd68ad21ccaec6e506e0c20216208a280841 WatchSource:0}: Error finding container 067d7f3c981aee6e7b6dafc5b12acd68ad21ccaec6e506e0c20216208a280841: Status 404 returned error can't find the container with id 067d7f3c981aee6e7b6dafc5b12acd68ad21ccaec6e506e0c20216208a280841 Feb 19 19:18:41 crc kubenswrapper[4722]: I0219 19:18:41.994666 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 
19:18:42.003188 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.003413 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:18:44.003379084 +0000 UTC m=+23.615729418 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.005755 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsfln"] Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.007113 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.010135 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.011029 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.014365 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.027327 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.029920 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 06:48:21.448865167 +0000 UTC Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.042682 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.060300 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.071029 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.071129 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.071277 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.071384 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.071456 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.071554 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104009 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104697 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104772 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104795 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104814 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104833 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104911 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104929 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104946 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert\") pod 
\"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104966 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.104987 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105010 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105030 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105076 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105139 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.105204 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjr2p\" (UniqueName: \"kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105447 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105482 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105500 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105534 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105625 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105639 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105558 4722 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:44.105539185 +0000 UTC m=+23.717889509 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105646 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105685 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:44.105657549 +0000 UTC m=+23.718008083 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105704 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105773 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:44.105740512 +0000 UTC m=+23.718090836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:42 crc kubenswrapper[4722]: E0219 19:18:42.105860 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:44.105844195 +0000 UTC m=+23.718194519 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.119505 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.141230 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.171244 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.180565 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206463 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206505 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206529 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206552 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjr2p\" (UniqueName: \"kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206609 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206664 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206694 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206755 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206757 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch\") pod 
\"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206791 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206827 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206841 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206856 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206861 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206869 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206885 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 
19:18:42.206890 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc 
kubenswrapper[4722]: I0219 19:18:42.207066 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207476 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.206734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207515 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207513 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.207517 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.210052 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.218176 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" 
event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerStarted","Data":"c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.218223 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerStarted","Data":"ffb06d45fa7531890253050a0ce71077ac6c26811651a46a6310b828e8171528"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.219383 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.220113 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.221696 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.222251 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.223787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.223815 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.223826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"39d83f73c898f095c09768da8c6f5deb56696c2e9e0d54c907be0c3008d68397"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.224816 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lwpgw" event={"ID":"e9734f69-4441-4618-849c-54e0aca328e4","Type":"ContainerStarted","Data":"d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.224840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lwpgw" event={"ID":"e9734f69-4441-4618-849c-54e0aca328e4","Type":"ContainerStarted","Data":"5b381afc5f721e8245adf251645ff7c0a743cb4e8446354064cc3fac12d762de"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.226466 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xq6bx" event={"ID":"fad04006-ed10-4444-ae85-9c0a31a95466","Type":"ContainerStarted","Data":"ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.227680 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerStarted","Data":"5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.227709 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" 
event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerStarted","Data":"067d7f3c981aee6e7b6dafc5b12acd68ad21ccaec6e506e0c20216208a280841"} Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.228001 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.246314 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.273732 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjr2p\" (UniqueName: \"kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p\") pod \"ovnkube-node-vsfln\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.300697 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.322252 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.329498 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: W0219 19:18:42.335615 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eb7c404_f96e_43a7_b20f_b45d856c75a5.slice/crio-80457ad8997939dc8e0991d051b5ca049affdba095f79270711bc1380ced8db4 WatchSource:0}: Error finding container 80457ad8997939dc8e0991d051b5ca049affdba095f79270711bc1380ced8db4: Status 404 returned error can't find the container with id 
80457ad8997939dc8e0991d051b5ca049affdba095f79270711bc1380ced8db4 Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.340737 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.380337 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.408254 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.449304 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.480051 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.513421 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.520577 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.568627 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.606451 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.660937 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.690983 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 
19:18:42.735857 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.769239 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.808294 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.847992 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.905325 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.928243 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:42 crc kubenswrapper[4722]: I0219 19:18:42.969685 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:42Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.021073 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.030696 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 07:45:39.951751845 +0000 UTC Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.060708 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a
cfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.090905 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.136091 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.173826 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.208414 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.231900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e"} Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.233021 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" exitCode=0 Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.233094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.233190 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"80457ad8997939dc8e0991d051b5ca049affdba095f79270711bc1380ced8db4"} Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.234050 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330" exitCode=0 Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.234166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330"} Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.247143 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cb
ccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.291432 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.328065 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.372828 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.409852 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.449662 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.493925 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.531175 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 
19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.569044 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.608653 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.647799 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.693247 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.734320 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.769625 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.808498 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:43 crc kubenswrapper[4722]: I0219 19:18:43.855974 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:43Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.025268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.025446 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:18:48.02540642 +0000 UTC m=+27.637756794 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.031246 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 23:14:07.886909713 +0000 UTC Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.070973 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.071035 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.071219 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.071260 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.071003 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.071870 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.126539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.126596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.126627 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.126692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126715 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126826 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126840 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:48.126815558 +0000 UTC m=+27.739165922 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126825 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126865 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126850 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126881 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126889 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126928 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:48.126911451 +0000 UTC m=+27.739261785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126952 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:48.126940712 +0000 UTC m=+27.739291046 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.126964 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: E0219 19:18:44.127109 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-19 19:18:48.127079156 +0000 UTC m=+27.739429520 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.242967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.243014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.243027 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.243040 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.243050 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" 
event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.243061 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.245796 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1" exitCode=0 Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.245827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1"} Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.259978 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.282432 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.295200 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.311281 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.324560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.337750 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.353457 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 
19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.369349 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.379263 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.388275 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.403085 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.420367 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:44 crc kubenswrapper[4722]: I0219 19:18:44.434018 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:44Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.032402 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 16:36:58.029039836 +0000 UTC Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.250704 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b" exitCode=0 Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.250753 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b"} Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.276234 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.288188 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.303804 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.316627 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.327381 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.339537 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.350254 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.365267 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 
19:18:45.378543 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.390268 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.408435 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.419906 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:45 crc kubenswrapper[4722]: I0219 19:18:45.430425 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:45Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.033076 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:33:09.769161777 +0000 UTC Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.071076 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.071146 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.071080 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.071297 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.071433 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.071603 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.257605 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b" exitCode=0 Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.257647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b"} Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.264415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.277301 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.293235 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.310195 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.325680 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.331089 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.334903 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.339862 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.339989 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.352646 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de
37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.370067 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-
19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.384842 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.401168 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.409731 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.424476 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.434671 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-c
onfig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.450852 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.460641 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.476983 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.492326 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.509530 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.529766 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.548740 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.561485 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.573364 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.584054 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.596310 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.611051 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 
19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.627341 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.638841 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.651035 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.768965 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.770999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.771037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.771046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.771185 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.779598 4722 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.779981 4722 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.786199 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.786240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.786250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.786265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.786274 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.813197 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.817321 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.817362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.817378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.817394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.817404 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.830094 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.834955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.834998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.835012 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.835030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.835043 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.849279 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.853441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.853477 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.853493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.853514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.853529 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.869516 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.874900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.874944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.874958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.874986 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.875011 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.890175 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:46Z is after 2025-08-24T17:21:41Z"
Feb 19 19:18:46 crc kubenswrapper[4722]: E0219 19:18:46.890333 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.892682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.892713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.892725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.892763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.892778 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.996266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.996334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.996357 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.996386 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:46 crc kubenswrapper[4722]: I0219 19:18:46.996408 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:46Z","lastTransitionTime":"2026-02-19T19:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.033891 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 13:47:12.644790123 +0000 UTC
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.099814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.099920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.100022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.100130 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.100258 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.203032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.203416 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.203435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.203460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.203479 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.271759 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009" exitCode=0
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.271881 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009"}
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.290134 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.306525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.306574 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.306589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.306609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.306624 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.307015 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.322505 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z"
Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.341309 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.356408 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.373774 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.391383 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.407560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.409718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.409764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.410406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.410597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.410657 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.423928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.435944 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.448731 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.460669 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.476353 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.490367 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:47Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.513243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc 
kubenswrapper[4722]: I0219 19:18:47.513322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.513347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.513381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.513405 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.616247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.616317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.616332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.616351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.616366 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.719671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.719741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.719760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.719788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.719808 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.822396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.822434 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.822444 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.822458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.822467 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.925109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.925201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.925215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.925235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:47 crc kubenswrapper[4722]: I0219 19:18:47.925250 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:47Z","lastTransitionTime":"2026-02-19T19:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.027750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.027793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.027805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.027822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.027836 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.034133 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 15:04:38.911996376 +0000 UTC Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.066618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.066872 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:18:56.06685158 +0000 UTC m=+35.679201914 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.070615 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.070642 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.070688 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.070742 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.070870 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.070989 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.130221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.130253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.130263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.130278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.130288 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.167532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.167583 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.167610 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.167634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167753 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167849 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:56.167830935 +0000 UTC m=+35.780181259 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167923 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167937 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167949 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167979 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-19 19:18:56.167970549 +0000 UTC m=+35.780320873 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.167989 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.168014 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.168039 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.168050 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.168084 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:56.168052422 +0000 UTC m=+35.780402786 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:48 crc kubenswrapper[4722]: E0219 19:18:48.168121 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:56.168103483 +0000 UTC m=+35.780453837 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.233549 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.233610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.233626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.233647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.233663 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.280752 4722 generic.go:334] "Generic (PLEG): container finished" podID="285e13d6-a3ce-4bc2-9be4-bb6db3593a0d" containerID="6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61" exitCode=0 Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.280833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerDied","Data":"6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.288003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.288451 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.288493 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.299371 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.323735 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.326643 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.335704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.335737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.335748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.335764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.335777 4722 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.344650 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name
\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.360801 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.373584 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.394494 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.406595 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.420213 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.433622 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.437860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.437899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.437908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc 
kubenswrapper[4722]: I0219 19:18:48.437922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.437935 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.447818 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.461647 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.477657 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.487928 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.496770 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.509228 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.520068 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.529930 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.537960 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.540066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.540111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.540122 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.540137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.540163 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.551210 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.562192 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.571860 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.581452 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.589091 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.599932 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.611146 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.629504 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.640073 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.642785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.642830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.642843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.642860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.642872 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.656271 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:48Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.744954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.744999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.745013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.745029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.745041 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.847247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.847288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.847299 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.847317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.847329 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.921883 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.949415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.949454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.949465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.949481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:48 crc kubenswrapper[4722]: I0219 19:18:48.949494 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:48Z","lastTransitionTime":"2026-02-19T19:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.034857 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 02:07:25.167277077 +0000 UTC Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.052034 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.052071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.052080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.052095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.052105 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.111688 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.155020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.155065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.155079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.155097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.155109 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.257083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.257211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.257237 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.257268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.257292 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.295830 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" event={"ID":"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d","Type":"ContainerStarted","Data":"eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.296440 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.313836 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.329117 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.339318 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.351462 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.359567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.359602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.359613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.359628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.359639 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.365517 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.375604 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.388853 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.402601 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.415626 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.428378 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.442280 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.460861 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.462230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.462276 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.462292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.462313 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.462328 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.472514 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.489886 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb
26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.505364 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.521285 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.535565 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.552660 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.565308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.565366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.565385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.565409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.565428 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.566483 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.579730 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07
cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.598055 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.617399 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\
\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.631684 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.643406 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.658716 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.667783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.667883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.667910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.667940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.667977 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.671382 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.694712 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.707947 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.722314 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:49Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.770807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.770864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.770882 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.770908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.770926 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.873117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.873161 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.873172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.873184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.873192 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.975246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.975307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.975327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.975379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:49 crc kubenswrapper[4722]: I0219 19:18:49.975398 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:49Z","lastTransitionTime":"2026-02-19T19:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.035455 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 06:18:18.264110595 +0000 UTC Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.071204 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.071288 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.071294 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:50 crc kubenswrapper[4722]: E0219 19:18:50.071438 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:50 crc kubenswrapper[4722]: E0219 19:18:50.071546 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:50 crc kubenswrapper[4722]: E0219 19:18:50.071702 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.078874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.078928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.078946 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.078993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.079020 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.182679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.182817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.182843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.183288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.183308 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.286489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.286544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.286556 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.286571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.286582 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.389790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.390060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.390106 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.390200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.390227 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.492780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.492822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.492848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.492866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.492877 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.595398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.595448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.595465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.595485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.595501 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.698107 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.698188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.698207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.698233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.698251 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.813227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.813265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.813273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.813287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.813298 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.877134 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.915949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.915979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.915988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.916013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:50 crc kubenswrapper[4722]: I0219 19:18:50.916036 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:50Z","lastTransitionTime":"2026-02-19T19:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.018966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.019028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.019046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.019071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.019090 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.036502 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 02:35:40.319276734 +0000 UTC Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.088540 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.104614 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.118225 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.122854 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.122910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.122926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.122948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.122963 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.132209 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.144299 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07
cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.165996 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.182379 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\
\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.196628 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.208263 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.225629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.225670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.225681 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.225698 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.225710 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.232654 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.243707 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:
40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.260622 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.276741 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.291637 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:51Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.327963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.328011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.328023 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.328041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.328055 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.429970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.430316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.430325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.430339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.430349 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.548074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.548191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.548217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.548248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.548273 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.651053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.651101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.651114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.651132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.651145 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.754828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.754906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.754927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.754955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.754973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.858052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.858133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.858146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.858192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.858206 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.961813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.961920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.961954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.961996 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:51 crc kubenswrapper[4722]: I0219 19:18:51.962037 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:51Z","lastTransitionTime":"2026-02-19T19:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.037347 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 00:19:31.533773245 +0000 UTC Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.064406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.064438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.064448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.064460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.064471 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.070827 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:52 crc kubenswrapper[4722]: E0219 19:18:52.070912 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.070927 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.070942 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:52 crc kubenswrapper[4722]: E0219 19:18:52.070982 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:52 crc kubenswrapper[4722]: E0219 19:18:52.071038 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.166566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.166633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.166652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.166680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.166698 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.270370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.270437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.270456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.270481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.270498 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.307996 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/0.log" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.312175 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218" exitCode=1 Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.312218 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.313041 4722 scope.go:117] "RemoveContainer" containerID="4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.337987 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.357616 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.374383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.374429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.374440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.374457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.374469 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.379281 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:
18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.401949 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.414525 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.428202 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.441655 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.456021 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.476736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5b
ed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.480474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.480553 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.480577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.480607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.480629 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.494914 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.508192 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.521018 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.546815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.571095 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending 
*v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471
827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:52Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.582783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.582819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.582832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.582846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.582856 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.685117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.685184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.685196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.685213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.685223 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.787352 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.787402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.787419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.787442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.787459 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.890893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.890967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.890992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.891022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.891042 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.993859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.993907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.993919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.993936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:52 crc kubenswrapper[4722]: I0219 19:18:52.993946 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:52Z","lastTransitionTime":"2026-02-19T19:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.037890 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 11:52:56.559305319 +0000 UTC Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.095520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.095582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.095600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.095625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.095643 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.198086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.198124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.198135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.198161 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.198171 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.300132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.300193 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.300210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.300224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.300234 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.318376 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/0.log" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.322741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.323220 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.338897 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.353566 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.369195 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.382211 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.395248 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.403053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.403105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.403120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.403141 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.403182 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.410014 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.421658 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07
cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.436033 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.451354 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\
\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.464379 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.476901 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.489605 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.504059 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.505903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.505947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.505958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.505973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.505984 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.525433 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-co
nfig/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-
setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.608830 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4"] Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.609460 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.609928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.610028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.610049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.610229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.610257 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.611324 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.612177 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.626223 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\
"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.654490 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-co
nfig/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-
setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.667609 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.680586 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.696338 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.709502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.711945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.711975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.711984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.711997 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.712006 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.721042 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.721098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.721182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lqm\" (UniqueName: \"kubernetes.io/projected/20b917d0-317d-4ce9-96e2-b1aa95f89663-kube-api-access-26lqm\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.721204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.724686 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.737393 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.752758 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.769831 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.785807 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.799965 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.810446 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.813814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.813850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.813862 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.813878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.813890 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.822020 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26lqm\" (UniqueName: \"kubernetes.io/projected/20b917d0-317d-4ce9-96e2-b1aa95f89663-kube-api-access-26lqm\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.822050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.822078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.822094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.823028 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.823860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.830089 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.831865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/20b917d0-317d-4ce9-96e2-b1aa95f89663-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.846033 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:53Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.848263 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lqm\" (UniqueName: \"kubernetes.io/projected/20b917d0-317d-4ce9-96e2-b1aa95f89663-kube-api-access-26lqm\") pod \"ovnkube-control-plane-749d76644c-qt9f4\" (UID: \"20b917d0-317d-4ce9-96e2-b1aa95f89663\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.916521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.916584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.916607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.916637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.916659 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:53Z","lastTransitionTime":"2026-02-19T19:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:53 crc kubenswrapper[4722]: I0219 19:18:53.929245 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" Feb 19 19:18:53 crc kubenswrapper[4722]: W0219 19:18:53.949781 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20b917d0_317d_4ce9_96e2_b1aa95f89663.slice/crio-2733854b6045306a84dc1e2ce927946028ef56feb6e6218a293a97a0593087fd WatchSource:0}: Error finding container 2733854b6045306a84dc1e2ce927946028ef56feb6e6218a293a97a0593087fd: Status 404 returned error can't find the container with id 2733854b6045306a84dc1e2ce927946028ef56feb6e6218a293a97a0593087fd Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.018591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.018658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.018700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.018730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.018753 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.038311 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 14:00:41.431354651 +0000 UTC Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.070931 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.071003 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:54 crc kubenswrapper[4722]: E0219 19:18:54.071091 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.071003 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:54 crc kubenswrapper[4722]: E0219 19:18:54.071245 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:54 crc kubenswrapper[4722]: E0219 19:18:54.071393 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.121369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.121411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.121422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.121436 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.121448 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.224584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.224629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.224642 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.224658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.224670 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.326745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.326790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.326810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.326833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.326849 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.329599 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" event={"ID":"20b917d0-317d-4ce9-96e2-b1aa95f89663","Type":"ContainerStarted","Data":"0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.329667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" event={"ID":"20b917d0-317d-4ce9-96e2-b1aa95f89663","Type":"ContainerStarted","Data":"327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.329691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" event={"ID":"20b917d0-317d-4ce9-96e2-b1aa95f89663","Type":"ContainerStarted","Data":"2733854b6045306a84dc1e2ce927946028ef56feb6e6218a293a97a0593087fd"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.333085 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/1.log" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.334647 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/0.log" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.338929 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367" exitCode=1 Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.338982 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" 
event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.339048 4722 scope.go:117] "RemoveContainer" containerID="4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.340090 4722 scope.go:117] "RemoveContainer" containerID="64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367" Feb 19 19:18:54 crc kubenswrapper[4722]: E0219 19:18:54.340409 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.346102 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.367518 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.384601 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.402971 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 
19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.422736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.429311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.429374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.429398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.429427 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.429451 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.446587 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.458394 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.469037 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.488003 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5b
ed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.496878 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.509016 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.519117 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.526889 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.531450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.531480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.531491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.531504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.531514 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.537733 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.553494 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-co
nfig/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-
setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.572172 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\
"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"
env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 
19:18:54.584707 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.594401 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.604543 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.613885 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.621453 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.629761 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.633481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.633514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.633525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc 
kubenswrapper[4722]: I0219 19:18:54.633540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.633553 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.640423 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.651203 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.661572 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.675044 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.686787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.696863 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.714678 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.730657 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:54Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.736804 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.736861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.736876 4722 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.736893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.736903 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.840110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.840166 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.840176 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.840190 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.840200 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.943678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.943728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.943740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.943756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:54 crc kubenswrapper[4722]: I0219 19:18:54.943768 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:54Z","lastTransitionTime":"2026-02-19T19:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.039125 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 00:59:23.403723676 +0000 UTC Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.046854 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.046931 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.046956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.046985 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.047009 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.107783 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-s6hhp"] Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.112263 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: E0219 19:18:55.112432 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.133691 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.149836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.149896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.149920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.149949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.149973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.151751 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.169264 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.182928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.198586 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.219042 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.235318 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpp4f\" 
(UniqueName: \"kubernetes.io/projected/493acad5-7300-4941-9311-19b3d5f21786-kube-api-access-gpp4f\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.235417 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.235203 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.253358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.253421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.253437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.253459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.253474 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.254702 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273
666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.273561 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.288214 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.303727 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.326570 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.336247 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.336320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpp4f\" (UniqueName: \"kubernetes.io/projected/493acad5-7300-4941-9311-19b3d5f21786-kube-api-access-gpp4f\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: E0219 19:18:55.336507 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:55 crc 
kubenswrapper[4722]: E0219 19:18:55.336625 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:55.836595654 +0000 UTC m=+35.448945998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.343528 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-0
2-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.345288 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/1.log" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.350441 4722 scope.go:117] "RemoveContainer" containerID="64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367" Feb 19 19:18:55 crc kubenswrapper[4722]: E0219 19:18:55.350700 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.356838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.356934 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.356962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.356999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc 
kubenswrapper[4722]: I0219 19:18:55.357031 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.359422 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc 
kubenswrapper[4722]: I0219 19:18:55.368767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpp4f\" (UniqueName: \"kubernetes.io/projected/493acad5-7300-4941-9311-19b3d5f21786-kube-api-access-gpp4f\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.390568 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d010ba1ad7ba48546a981c9f61e05805f6580fef3620337534da745e5a70218\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:51Z\\\",\\\"message\\\":\\\".300302 6044 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0219 19:18:51.300324 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 19:18:51.300330 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 19:18:51.300353 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI0219 19:18:51.300365 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 19:18:51.300369 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 19:18:51.300379 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 19:18:51.300397 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:18:51.300413 6044 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0219 19:18:51.300416 6044 factory.go:656] Stopping watch factory\\\\nI0219 19:18:51.300421 6044 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0219 19:18:51.300428 6044 ovnkube.go:599] Stopped ovnkube\\\\nI0219 19:18:51.300436 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:18:51.300445 6044 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 19:18:51.300450 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 19:18:51.300457 6044 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\
"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"
env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 
19:18:55.406524 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.425415 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.438501 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.455039 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.459529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.459574 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.459588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc 
kubenswrapper[4722]: I0219 19:18:55.459611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.459624 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.471943 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.496596 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.509582 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.526673 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.541821 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.552567 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.561535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.561563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.561573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.561587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.561599 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.565101 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.575816 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.587399 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc 
kubenswrapper[4722]: I0219 19:18:55.603777 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.614108 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.626701 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.638240 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:55Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.667991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.668036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.668047 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.668062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.668076 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.770094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.770456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.770635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.770758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.770820 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.841924 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:55 crc kubenswrapper[4722]: E0219 19:18:55.842276 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:55 crc kubenswrapper[4722]: E0219 19:18:55.842405 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:56.842368587 +0000 UTC m=+36.454718961 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.874584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.874652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.874675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.874705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.874727 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.977521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.977583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.977606 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.977631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:55 crc kubenswrapper[4722]: I0219 19:18:55.977650 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:55Z","lastTransitionTime":"2026-02-19T19:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.039355 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:53:36.731192633 +0000 UTC Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.071227 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.071468 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.071267 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.071261 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.071722 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.071605 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.081718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.081791 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.081815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.081848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.081873 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.145771 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.146034 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:19:12.146004293 +0000 UTC m=+51.758354647 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.184692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.184787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.184806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.184829 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.184848 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.247464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.247533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.247575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.247642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247681 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247785 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:12.247760813 +0000 UTC m=+51.860111167 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247814 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247837 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247833 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247934 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:12.247911778 +0000 UTC m=+51.860262122 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247835 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247976 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247856 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.248109 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:12.248077673 +0000 UTC m=+51.860428027 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.247991 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.248199 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:12.248187576 +0000 UTC m=+51.860537910 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.287837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.287942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.287965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.287995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.288014 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.390636 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.390712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.390734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.390764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.390789 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.494506 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.494569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.494586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.494611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.494630 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.596831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.596877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.596889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.596907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.596918 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.700728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.700781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.700793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.700811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.700824 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.804279 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.804317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.804327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.804342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.804355 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.860299 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.860493 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: E0219 19:18:56.860586 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:18:58.860563319 +0000 UTC m=+38.472913703 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.906821 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.906860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.906869 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.906883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:56 crc kubenswrapper[4722]: I0219 19:18:56.906893 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:56Z","lastTransitionTime":"2026-02-19T19:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.009459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.009499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.009516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.009537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.009549 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.039537 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 04:38:40.736678785 +0000 UTC Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.071193 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.071357 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.113305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.113385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.113407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.113440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.113463 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.133391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.133450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.133468 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.133492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.133510 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.155662 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:57Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.161502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.161550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.161567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.161593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.161611 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.183529 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:57Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.188754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.188846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.188866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.188892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.188910 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.204886 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:57Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.212633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.212690 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.212710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.212733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.212750 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.225486 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:57Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.229258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.229294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.229308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.229326 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.229339 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.241333 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:57Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:57 crc kubenswrapper[4722]: E0219 19:18:57.241557 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.243354 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.243392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.243402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.243418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.243430 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.346032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.346087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.346102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.346118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.346127 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.449143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.449200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.449210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.449227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.449239 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.552763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.552809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.552820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.552838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.552850 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.655756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.655803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.655816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.655833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.655846 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.758586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.758675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.758688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.758705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.758717 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.861656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.861712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.861735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.861760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.861779 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.964318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.964364 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.964375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.964394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:57 crc kubenswrapper[4722]: I0219 19:18:57.964408 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:57Z","lastTransitionTime":"2026-02-19T19:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.039696 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 12:06:51.551069931 +0000 UTC Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.067511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.067580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.067593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.067616 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.067632 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.070912 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.070997 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.071025 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:18:58 crc kubenswrapper[4722]: E0219 19:18:58.071084 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:18:58 crc kubenswrapper[4722]: E0219 19:18:58.071296 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:18:58 crc kubenswrapper[4722]: E0219 19:18:58.071428 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.170280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.170372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.170387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.170402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.170414 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.273615 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.273678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.273699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.273719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.273730 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.376262 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.376317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.376329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.376347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.376358 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.479874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.479969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.479994 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.480024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.480047 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.583551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.583611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.583628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.583688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.583706 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.686234 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.686297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.686318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.686337 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.686350 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.759715 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.784748 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.789328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.789389 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.789412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.789471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.789494 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.800670 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.818228 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.833969 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.847815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.864996 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.879791 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:58 crc kubenswrapper[4722]: E0219 19:18:58.880038 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:58 crc kubenswrapper[4722]: E0219 19:18:58.880106 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:02.880084797 +0000 UTC m=+42.492435151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.886618 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.892078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.892137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.892191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.892220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.892238 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:58Z","lastTransitionTime":"2026-02-19T19:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.907916 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.929579 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.950437 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.964469 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:58 crc kubenswrapper[4722]: I0219 19:18:58.987293 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:58Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.001063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.001145 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.001211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.001242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.001264 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.006550 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metri
cs-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:59Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.020132 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:59Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:59 crc 
kubenswrapper[4722]: I0219 19:18:59.036773 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:59Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.039825 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 02:43:46.344129358 +0000 UTC Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.084718 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.084672 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:18:59Z is after 2025-08-24T17:21:41Z" Feb 19 19:18:59 crc kubenswrapper[4722]: E0219 19:18:59.084909 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.103768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.104011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.104108 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.104209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.104313 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.207870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.208132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.208229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.208319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.208389 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.311225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.311281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.311298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.311323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.311339 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.415254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.415294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.415306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.415322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.415335 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.518825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.518861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.518873 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.518891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.518903 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.622451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.622502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.622519 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.622541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.622558 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.725676 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.725732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.725748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.725777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.725818 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.829613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.829680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.829698 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.829723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.829741 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.933334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.933673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.933898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.934079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:18:59 crc kubenswrapper[4722]: I0219 19:18:59.934361 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:18:59Z","lastTransitionTime":"2026-02-19T19:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.037856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.037923 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.037947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.037974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.037997 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.040507 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 05:55:35.662232458 +0000 UTC Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.070900 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.070940 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.070900 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:00 crc kubenswrapper[4722]: E0219 19:19:00.071084 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:00 crc kubenswrapper[4722]: E0219 19:19:00.071189 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:00 crc kubenswrapper[4722]: E0219 19:19:00.071302 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.141796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.141864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.141887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.141917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.141942 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.245111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.245208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.245256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.245284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.245303 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.348414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.348472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.348490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.348520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.348544 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.450712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.450767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.450781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.450801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.450816 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.554044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.554385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.554426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.554757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.554771 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.658087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.658137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.658196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.658227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.658251 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.760982 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.761185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.761213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.761281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.761308 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.864573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.864656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.864715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.864749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.864770 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.968319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.968399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.968427 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.968454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:00 crc kubenswrapper[4722]: I0219 19:19:00.968478 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:00Z","lastTransitionTime":"2026-02-19T19:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.040655 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 04:30:27.898358249 +0000 UTC Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.070498 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:01 crc kubenswrapper[4722]: E0219 19:19:01.072723 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.077517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.077751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.077902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.078055 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.078216 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.092091 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.110317 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc 
kubenswrapper[4722]: I0219 19:19:01.131497 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.149558 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.168271 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.181567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.181621 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.181638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.181666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.181684 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.185063 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.204252 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:
40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.232517 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.246521 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.270803 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.284542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.284586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.284602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.284621 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.284638 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.295044 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0
dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.314316 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.328097 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.342881 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.355338 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.374037 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:01Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.387788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc 
kubenswrapper[4722]: I0219 19:19:01.388186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.388378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.388679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.389055 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.492453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.492514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.492538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.492566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.492588 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.595008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.595051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.595061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.595079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.595092 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.698242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.698309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.698332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.698361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.698383 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.800263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.800302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.800312 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.800327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.800336 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.903730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.903797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.903818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.903845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:01 crc kubenswrapper[4722]: I0219 19:19:01.903870 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:01Z","lastTransitionTime":"2026-02-19T19:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.006306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.006371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.006388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.006412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.006430 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.041552 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 12:50:43.977918212 +0000 UTC Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.070986 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.071090 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.071180 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:19:02 crc kubenswrapper[4722]: E0219 19:19:02.071366 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:19:02 crc kubenswrapper[4722]: E0219 19:19:02.071518 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:19:02 crc kubenswrapper[4722]: E0219 19:19:02.071693 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.109587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.109916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.110051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.110358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.110513 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.213537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.213656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.213683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.213712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.213733 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.316575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.316930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.317067 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.317294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.317430 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.420481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.420538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.420558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.420583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.420633 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.524073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.525103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.525351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.525519 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.525668 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.629301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.629378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.629402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.629431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.629453 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.732630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.732691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.732722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.732770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.732798 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.835461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.835534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.835560 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.835593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.835614 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.924744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:19:02 crc kubenswrapper[4722]: E0219 19:19:02.924921 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 19 19:19:02 crc kubenswrapper[4722]: E0219 19:19:02.925600 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:10.925564294 +0000 UTC m=+50.537914658 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.939270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.939318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.939330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.939349 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:02 crc kubenswrapper[4722]: I0219 19:19:02.939364 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:02Z","lastTransitionTime":"2026-02-19T19:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.041735 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 19:13:16.842382981 +0000 UTC
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.041968 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.042047 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.042073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.042144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.042218 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.070722 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:19:03 crc kubenswrapper[4722]: E0219 19:19:03.070954 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.145048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.145100 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.145117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.145142 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.145193 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.247332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.247368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.247379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.247394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.247405 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.350389 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.350451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.350471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.350496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.350514 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.453467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.453858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.454041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.454401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.454709 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.557554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.557631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.557653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.557683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.557706 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.660534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.660829 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.661050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.661258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.661436 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.764680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.765833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.765879 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.765908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.765932 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.869532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.869618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.869637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.869665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.869685 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.972510 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.972940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.973330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.973687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:03 crc kubenswrapper[4722]: I0219 19:19:03.974066 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:03Z","lastTransitionTime":"2026-02-19T19:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.042524 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 20:04:50.819418203 +0000 UTC
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.070848 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:19:04 crc kubenswrapper[4722]: E0219 19:19:04.071281 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.070999 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:19:04 crc kubenswrapper[4722]: E0219 19:19:04.071640 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.070953 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:19:04 crc kubenswrapper[4722]: E0219 19:19:04.071861 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.077857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.077924 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.077969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.077999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.078016 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.181830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.182245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.182503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.182767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.183004 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.285992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.286458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.286635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.286802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.286980 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.390262 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.390643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.390796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.390947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.391090 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.494517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.494585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.494602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.494627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.494645 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.598112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.598208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.598226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.598253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.598270 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.701344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.701465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.701484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.701512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.701530 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.804398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.805240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.805286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.805323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.805346 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.908972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.909027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.909045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.909067 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:04 crc kubenswrapper[4722]: I0219 19:19:04.909084 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:04Z","lastTransitionTime":"2026-02-19T19:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.012841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.012898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.012915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.012937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.012954 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.042679 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 14:42:39.312512215 +0000 UTC Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.070795 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:05 crc kubenswrapper[4722]: E0219 19:19:05.070990 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.117048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.117209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.117230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.117254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.117270 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.220751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.220800 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.220817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.220841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.220860 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.323860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.324138 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.324251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.324332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.324403 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.427424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.427511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.427539 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.427569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.427593 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.530610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.530961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.531059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.531178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.531276 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.634008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.634045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.634056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.634072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.634084 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.737050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.737132 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.737226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.737259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.737295 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.840658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.840709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.840720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.840768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.840782 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.944392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.944459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.944475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.944498 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:05 crc kubenswrapper[4722]: I0219 19:19:05.944515 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:05Z","lastTransitionTime":"2026-02-19T19:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.043513 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 03:28:33.136304715 +0000 UTC Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.047394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.047473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.047499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.047530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.047553 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.071091 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.071184 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:06 crc kubenswrapper[4722]: E0219 19:19:06.071398 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:06 crc kubenswrapper[4722]: E0219 19:19:06.071566 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.071870 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:06 crc kubenswrapper[4722]: E0219 19:19:06.071982 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.150555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.150626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.150639 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.150659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.150670 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.254292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.254341 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.254353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.254370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.254382 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.356967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.357014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.357027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.357042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.357053 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.459988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.460032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.460043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.460061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.460075 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.563807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.563874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.563896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.563925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.563949 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.668262 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.668330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.668354 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.668385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.668407 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.770697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.770746 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.770757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.770772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.770784 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.873632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.873740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.873766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.873802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.873825 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.977451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.977525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.977543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.977987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:06 crc kubenswrapper[4722]: I0219 19:19:06.978042 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:06Z","lastTransitionTime":"2026-02-19T19:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.044478 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 10:47:41.085538751 +0000 UTC Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.071084 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.071288 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.083925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.084000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.084027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.084073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.084093 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.186988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.187224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.187246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.187270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.187287 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.290521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.290745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.290852 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.290942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.291021 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.394728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.394759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.394767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.394780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.394789 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.498431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.498479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.498497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.498520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.498539 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.602097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.602189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.602200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.602225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.602275 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.617860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.617935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.617953 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.617979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.618000 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.640035 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:07Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.646208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.646273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.646285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.646304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.646742 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.663859 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:07Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.668204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.668270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.668289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.668314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.668331 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.688820 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:07Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.693550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.693622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.693634 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.693674 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.693702 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.713113 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:07Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.717623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.717783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.717886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.718038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.718174 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.741721 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:07Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:07 crc kubenswrapper[4722]: E0219 19:19:07.742280 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.744129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.744328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.744471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.744747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.744959 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.848658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.848750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.848767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.849475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.849502 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.953957 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.954056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.954076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.954101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:07 crc kubenswrapper[4722]: I0219 19:19:07.954123 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:07Z","lastTransitionTime":"2026-02-19T19:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.044617 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 11:27:15.451662328 +0000 UTC Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.057029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.057093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.057123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.057243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.057334 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.070624 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.070696 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.070696 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:08 crc kubenswrapper[4722]: E0219 19:19:08.070788 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:08 crc kubenswrapper[4722]: E0219 19:19:08.071186 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:08 crc kubenswrapper[4722]: E0219 19:19:08.071051 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.160803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.160886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.160904 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.160987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.161091 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.265297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.265956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.266245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.266431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.266642 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.369793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.369867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.369889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.369926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.369948 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.473564 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.473646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.473865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.473897 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.473922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.577326 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.577436 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.577451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.577469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.577482 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.679305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.679374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.679393 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.679420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.679441 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.782284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.782348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.782366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.782391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.782409 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.885489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.885551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.885570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.885597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.885616 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.988285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.988343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.988360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.988384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:08 crc kubenswrapper[4722]: I0219 19:19:08.988403 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:08Z","lastTransitionTime":"2026-02-19T19:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.045566 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 22:32:25.168409693 +0000 UTC
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.071100 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:19:09 crc kubenswrapper[4722]: E0219 19:19:09.071336 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.091384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.091432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.091450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.091473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.091491 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.195083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.195194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.195235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.195268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.195295 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.298770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.298823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.298831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.298845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.298856 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.401367 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.401426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.401442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.401465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.401486 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.504702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.504812 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.504824 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.504844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.504853 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.608096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.608212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.608232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.608261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.608280 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.711479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.711550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.711574 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.711604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.711627 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.815315 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.815358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.815368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.815385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.815394 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.919311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.919372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.919389 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.919413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:09 crc kubenswrapper[4722]: I0219 19:19:09.919431 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:09Z","lastTransitionTime":"2026-02-19T19:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.023076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.024068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.024283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.024446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.024576 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.046494 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 02:03:14.502098327 +0000 UTC Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.069635 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.070571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.070682 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.070571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:10 crc kubenswrapper[4722]: E0219 19:19:10.070743 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:10 crc kubenswrapper[4722]: E0219 19:19:10.070853 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:10 crc kubenswrapper[4722]: E0219 19:19:10.070945 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.072316 4722 scope.go:117] "RemoveContainer" containerID="64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.086867 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.089743 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.107881 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0
a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.120429 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.127524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.127565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.127577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc 
kubenswrapper[4722]: I0219 19:19:10.127594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.127607 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.136822 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.153489 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.168702 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.186856 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.210564 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.227580 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.232721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.232799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.232887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.232920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.232944 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.250973 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.266575 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.284232 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc 
kubenswrapper[4722]: I0219 19:19:10.311380 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.327563 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.338235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.338275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.338286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.338306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.338318 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.342283 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.354573 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.408831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/1.log" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.413477 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" 
event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.415545 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.432868 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:
23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e
4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\"
:\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.440932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.440983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.441001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.441023 
4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.441041 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.448678 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.462734 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.472851 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.485654 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.499512 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.511875 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8
410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.526263 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc 
kubenswrapper[4722]: I0219 19:19:10.543435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.543476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.543488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.543505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.543518 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.544860 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78d
f8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.561317 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.577638 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.594176 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.608961 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.620539 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.639410 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.645221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.645412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.645538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.645612 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.645675 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.650891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.662809 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:10Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.748376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.748656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.748771 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.748854 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.748917 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.851704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.852071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.852260 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.852396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.852511 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.956100 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.956188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.956211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.956240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:10 crc kubenswrapper[4722]: I0219 19:19:10.956261 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:10Z","lastTransitionTime":"2026-02-19T19:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.021115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:11 crc kubenswrapper[4722]: E0219 19:19:11.021278 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:11 crc kubenswrapper[4722]: E0219 19:19:11.021691 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:27.021670648 +0000 UTC m=+66.634020982 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.047126 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 12:15:53.9038052 +0000 UTC Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.058348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.058372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.058400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.058414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.058423 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.071111 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:11 crc kubenswrapper[4722]: E0219 19:19:11.071426 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.090072 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.105730 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.121051 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.133207 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.147625 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.159918 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.159956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.159968 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc 
kubenswrapper[4722]: I0219 19:19:11.159984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.159998 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.165057 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.183668 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.199296 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.214323 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.228027 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.242095 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.263247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.263304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.263318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.263336 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.263347 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.264751 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.283815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.297837 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc 
kubenswrapper[4722]: I0219 19:19:11.316941 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.348990 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] 
Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367224 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.367358 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.423996 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/2.log" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.424964 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/1.log" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.427998 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a" exitCode=1 Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.428049 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.428095 4722 scope.go:117] "RemoveContainer" containerID="64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.428925 4722 scope.go:117] "RemoveContainer" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a" Feb 19 19:19:11 crc kubenswrapper[4722]: E0219 19:19:11.429125 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 
20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.446231 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:
41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.458846 4722 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.469656 4722 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.469720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.469739 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.469763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.469784 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.474769 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.489485 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.500349 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.508894 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.517178 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.531826 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.544186 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.553493 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc 
kubenswrapper[4722]: I0219 19:19:11.564226 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.574387 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.575007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.575101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.575196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.575272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.575328 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.586559 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.600750 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.622522 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64814bc2cb31a0d73577f67dcbc6984139585cc692029c4e99b970ca28874367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"message\\\":\\\"GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.110:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f9232b32-e89f-4c8e-acc4-c6801b70dcb0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0219 19:18:53.334284 6191 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 2.488018ms)\\\\nI0219 19:18:53.334422 6191 services_controller.go:452] Built service openshift-kube-scheduler-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334476 6191 services_controller.go:453] Built service openshift-kube-scheduler-operator/metrics template LB for network=default: []services.LB{}\\\\nI0219 19:18:53.334483 6191 services_controller.go:454] Service openshift-kube-scheduler-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0219 19:18:53.334406 6191 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bi
n\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.642320 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.655027 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:11Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.678381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.678421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.678430 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.678443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.678452 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.781251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.781302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.781316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.781333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.781347 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.883502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.883538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.883547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.883564 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.883573 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.986092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.986143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.986189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.986213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:11 crc kubenswrapper[4722]: I0219 19:19:11.986230 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:11Z","lastTransitionTime":"2026-02-19T19:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.047582 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 23:28:33.964837636 +0000 UTC Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.071446 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.071626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.072235 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.072349 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.072385 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.072511 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.089877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.089929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.089945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.089965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.089981 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.192872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.192929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.192940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.192954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.192965 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.232099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.232344 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:19:44.232305372 +0000 UTC m=+83.844655706 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.295043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.295082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.295093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.295108 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.295121 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.333539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.333596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.333624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.333666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333791 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333813 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333826 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333882 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:44.333866956 +0000 UTC m=+83.946217290 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333873 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333998 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.334116 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.334252 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.334008 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:44.333974549 +0000 UTC m=+83.946324903 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.333878 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.334476 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:44.334371672 +0000 UTC m=+83.946722036 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.334549 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:44.334530967 +0000 UTC m=+83.946881421 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.397828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.398282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.398301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.398325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.398342 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.433781 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/2.log" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.437126 4722 scope.go:117] "RemoveContainer" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a" Feb 19 19:19:12 crc kubenswrapper[4722]: E0219 19:19:12.437410 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.457532 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.484851 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.497482 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.501628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.501678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.501698 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc 
kubenswrapper[4722]: I0219 19:19:12.501721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.501738 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.510488 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.520479 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.532553 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.548299 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.563830 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.575530 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.586736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.601222 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.606225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.606322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.606345 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.606369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.606386 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.610940 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.622604 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb
26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203
bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-
cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.634656 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.644302 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc 
kubenswrapper[4722]: I0219 19:19:12.656072 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.669171 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:12Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.708714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.708803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.708827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.708861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.708887 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.813043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.813190 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.813218 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.813244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.813261 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.916709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.916771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.916788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.916811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:12 crc kubenswrapper[4722]: I0219 19:19:12.916829 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:12Z","lastTransitionTime":"2026-02-19T19:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.019957 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.020043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.020074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.020104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.020137 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.048310 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 13:39:11.305739652 +0000 UTC Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.070982 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:13 crc kubenswrapper[4722]: E0219 19:19:13.071233 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.123233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.123286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.123306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.123330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.123349 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.225752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.225816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.225839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.225866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.225888 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.328056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.328094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.328103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.328117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.328126 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.431286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.431338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.431350 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.431368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.431380 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.534133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.534189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.534200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.534217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.534229 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.636741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.636802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.636818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.636837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.636851 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.739723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.739765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.739774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.739788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.739797 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.843476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.843517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.843527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.843546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.843555 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.945956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.946003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.946017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.946036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:13 crc kubenswrapper[4722]: I0219 19:19:13.946051 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:13Z","lastTransitionTime":"2026-02-19T19:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048427 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 18:29:40.759575137 +0000 UTC Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.048821 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.070661 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:14 crc kubenswrapper[4722]: E0219 19:19:14.070799 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.070657 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.070661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:14 crc kubenswrapper[4722]: E0219 19:19:14.070893 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:14 crc kubenswrapper[4722]: E0219 19:19:14.071084 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.152273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.152332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.152341 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.152371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.152382 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.256565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.256647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.256662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.256679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.256707 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.360193 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.360233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.360243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.360257 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.360267 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.462849 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.462919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.462932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.462952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.462964 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.565773 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.565816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.565826 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.565840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.565849 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.667813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.667851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.667861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.667874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.667884 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.770418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.770470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.770485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.770503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.770535 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.872681 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.872727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.872736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.872748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.872756 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.974807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.974851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.974860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.974876 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:14 crc kubenswrapper[4722]: I0219 19:19:14.974889 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:14Z","lastTransitionTime":"2026-02-19T19:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.049804 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 15:57:14.419190545 +0000 UTC Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.070622 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:15 crc kubenswrapper[4722]: E0219 19:19:15.070814 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.076971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.077020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.077037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.077059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.077078 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.180022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.180098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.180120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.180180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.180212 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.282813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.282866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.282878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.282898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.282910 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.386123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.386172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.386184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.386199 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.386210 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.488504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.488631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.488694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.488718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.488772 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.591594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.591668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.591697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.591724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.591745 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.694288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.694360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.694382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.694410 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.694432 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.797780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.797858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.797879 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.797902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.797919 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.900333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.900394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.900411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.900436 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:15 crc kubenswrapper[4722]: I0219 19:19:15.900450 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:15Z","lastTransitionTime":"2026-02-19T19:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.002848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.002891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.002902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.002920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.002930 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.050831 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 05:17:01.758381173 +0000 UTC Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.070437 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.070461 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:16 crc kubenswrapper[4722]: E0219 19:19:16.070580 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.070621 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:16 crc kubenswrapper[4722]: E0219 19:19:16.070674 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:16 crc kubenswrapper[4722]: E0219 19:19:16.070810 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.106129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.106180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.106192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.106209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.106218 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.208640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.208695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.208736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.208759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.208775 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.312114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.312224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.312245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.312268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.312285 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.414905 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.414941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.414950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.414964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.414973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.518223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.518275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.518289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.518307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.518319 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.621597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.621671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.621695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.621725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.621748 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.724836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.724899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.724917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.724941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.724958 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.830010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.830093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.830120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.830155 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.830224 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.933568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.933624 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.933641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.933664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:16 crc kubenswrapper[4722]: I0219 19:19:16.933682 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:16Z","lastTransitionTime":"2026-02-19T19:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.037213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.037288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.037308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.037331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.037347 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.051487 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 23:40:33.946486748 +0000 UTC Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.070556 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:17 crc kubenswrapper[4722]: E0219 19:19:17.070733 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.140667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.140719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.140736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.140758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.140775 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.244328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.244420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.244454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.244486 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.244508 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.348278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.348339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.348355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.348379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.348396 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.451680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.451734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.451753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.451777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.451794 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.554648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.554709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.554727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.554768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.554807 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.657891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.657959 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.657971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.657987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.657998 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.760072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.760127 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.760136 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.760151 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.760179 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.862339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.862478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.862498 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.862582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.862608 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.965351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.965380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.965388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.965401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:17 crc kubenswrapper[4722]: I0219 19:19:17.965410 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:17Z","lastTransitionTime":"2026-02-19T19:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.051678 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 15:53:46.342556142 +0000 UTC Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.068278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.068313 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.068325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.068339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.068348 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.070495 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.070628 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.070789 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.070840 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.070937 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.070988 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.140784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.140831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.140843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.140859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.140872 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.155415 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:18Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.159629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.159667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.159678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.159696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.159709 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.171250 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:18Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.174836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.174870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.174900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.174913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.174921 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.186650 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:18Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.190818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.190866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.190875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.190891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.190900 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.203898 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:18Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.207493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.207529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.207538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.207554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.207563 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.219969 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:18Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:18 crc kubenswrapper[4722]: E0219 19:19:18.220184 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.221960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.221987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.221998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.222012 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.222022 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.324813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.324867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.324887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.324910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.324929 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.427377 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.427439 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.427457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.427484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.427501 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.531257 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.531324 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.531342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.531366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.531384 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.634838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.634928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.634948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.634975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.635022 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.738707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.739118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.739318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.739462 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.739585 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.841746 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.841815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.841839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.841867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.841888 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.944370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.944610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.944696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.944780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:18 crc kubenswrapper[4722]: I0219 19:19:18.944845 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:18Z","lastTransitionTime":"2026-02-19T19:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.047208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.047251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.047260 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.047273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.047283 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.052591 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 18:44:00.412904423 +0000 UTC Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.070924 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:19 crc kubenswrapper[4722]: E0219 19:19:19.071103 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.150345 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.150396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.150407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.150425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.150435 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.252856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.252903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.252914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.252932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.252944 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.356428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.356494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.356516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.356546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.356568 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.459424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.459474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.459494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.459516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.459532 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.563592 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.563668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.563693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.563721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.563742 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.666587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.666641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.666659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.666682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.666701 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.769450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.769511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.769526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.769546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.769558 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.872892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.872940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.872948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.872961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.872971 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.977343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.977408 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.977426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.977500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:19 crc kubenswrapper[4722]: I0219 19:19:19.977521 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:19Z","lastTransitionTime":"2026-02-19T19:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.053520 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:41:03.878988016 +0000 UTC Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.070346 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:20 crc kubenswrapper[4722]: E0219 19:19:20.070718 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.070822 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.070812 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:20 crc kubenswrapper[4722]: E0219 19:19:20.070969 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:20 crc kubenswrapper[4722]: E0219 19:19:20.071183 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.080235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.080293 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.080314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.080339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.080358 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.183777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.183831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.183848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.183869 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.183885 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.287254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.287340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.287353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.287369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.287381 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.390285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.390347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.390367 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.390429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.390448 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.493928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.493981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.494003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.494033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.494055 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.596333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.596365 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.596373 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.596385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.596393 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.699420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.699501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.699519 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.699543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.699563 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.802520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.802599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.802623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.802656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.802680 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.906068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.906143 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.906201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.906230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:20 crc kubenswrapper[4722]: I0219 19:19:20.906251 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:20Z","lastTransitionTime":"2026-02-19T19:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.010129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.010187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.010196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.010208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.010219 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.054088 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 02:18:29.741901914 +0000 UTC Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.070678 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:21 crc kubenswrapper[4722]: E0219 19:19:21.070877 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.095779 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\
\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.113124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.113488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.114016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.114526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.114941 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.113894 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.130872 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.145580 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.160159 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.180194 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.190980 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc 
kubenswrapper[4722]: I0219 19:19:21.203904 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.219448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.219503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.219527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.219551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.219569 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.220082 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.234751 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.246836 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.267008 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.280838 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.296968 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.318135 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.322113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.322197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.322216 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.322239 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.322260 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.337817 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.353257 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:21Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.424483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.424808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.424932 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.425048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.425138 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.528064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.528140 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.528174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.528192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.528206 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.631560 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.631588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.631599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.631611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.631619 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.734741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.734811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.734833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.734861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.734883 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.837620 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.837755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.837770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.837787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.837800 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.941086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.941194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.941222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.941255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:21 crc kubenswrapper[4722]: I0219 19:19:21.941278 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:21Z","lastTransitionTime":"2026-02-19T19:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.044123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.044178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.044187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.044200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.044210 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.054581 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 04:05:54.162692732 +0000 UTC
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.071101 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.071204 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:19:22 crc kubenswrapper[4722]: E0219 19:19:22.071224 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:19:22 crc kubenswrapper[4722]: E0219 19:19:22.071374 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.071767 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:19:22 crc kubenswrapper[4722]: E0219 19:19:22.072093 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.146852 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.146936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.146960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.146988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.147015 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.249320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.249353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.249362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.249376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.249385 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.351285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.351332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.351344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.351358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.351369 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.454441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.454517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.454530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.454550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.454565 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.557845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.557909 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.557922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.557942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.557955 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.661532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.661566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.661576 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.661592 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.661600 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.764496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.764637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.764663 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.764694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.764712 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.867567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.867617 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.867635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.867658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.867676 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.970795 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.970840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.970849 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.970864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:22 crc kubenswrapper[4722]: I0219 19:19:22.970873 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:22Z","lastTransitionTime":"2026-02-19T19:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.055191 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 20:36:53.755172638 +0000 UTC
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.071434 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:19:23 crc kubenswrapper[4722]: E0219 19:19:23.071650 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.073440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.073486 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.073505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.073528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.073544 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.176789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.176843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.176857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.176877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.176898 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.279677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.280046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.280220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.280384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.280525 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.384051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.384422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.384562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.384693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.384826 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.487380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.487698 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.487840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.487993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.488252 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.590951 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.591444 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.591591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.591766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.592672 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.696289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.696678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.696864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.697046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.697467 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.800969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.801332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.801546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.801723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.801892 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.905226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.905938 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.906049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.906180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:23 crc kubenswrapper[4722]: I0219 19:19:23.906294 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:23Z","lastTransitionTime":"2026-02-19T19:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.009600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.010309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.015588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.015836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.016054 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.055938 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 08:04:12.153452002 +0000 UTC
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.070679 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:19:24 crc kubenswrapper[4722]: E0219 19:19:24.071248 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.071052 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:19:24 crc kubenswrapper[4722]: E0219 19:19:24.072473 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.071010 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:19:24 crc kubenswrapper[4722]: E0219 19:19:24.073283 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.119992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.120228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.120376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.120504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.120619 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.224323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.224723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.224887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.225060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.225269 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.328174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.328234 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.328249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.328271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.328286 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.430791 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.430857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.430870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.430892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.430908 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.533837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.533896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.533913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.533937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.533954 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.635924 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.635975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.635989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.636007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.636021 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.738616 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.738655 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.738664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.738678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.738687 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.841912 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.841954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.841964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.841981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.841993 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.944496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.944572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.944590 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.944618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:24 crc kubenswrapper[4722]: I0219 19:19:24.944635 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:24Z","lastTransitionTime":"2026-02-19T19:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.047259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.047296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.047304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.047320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.047330 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.056397 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 09:19:33.135184471 +0000 UTC Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.070799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:25 crc kubenswrapper[4722]: E0219 19:19:25.070923 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.149464 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.149502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.149516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.149529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.149539 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.252093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.252147 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.252196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.252219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.252239 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.356742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.357321 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.357377 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.357400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.357415 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.463520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.463567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.463579 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.463597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.463626 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.566525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.566558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.566567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.566581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.566590 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.669352 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.669407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.669423 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.669449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.669466 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.772133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.772184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.772195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.772210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.772220 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.874533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.874561 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.874568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.874581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.874591 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.977274 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.977307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.977317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.977332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:25 crc kubenswrapper[4722]: I0219 19:19:25.977341 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:25Z","lastTransitionTime":"2026-02-19T19:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.056520 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 16:45:29.938180411 +0000 UTC Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.070887 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.071002 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:26 crc kubenswrapper[4722]: E0219 19:19:26.071157 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.071217 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:26 crc kubenswrapper[4722]: E0219 19:19:26.071505 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:26 crc kubenswrapper[4722]: E0219 19:19:26.071679 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.071900 4722 scope.go:117] "RemoveContainer" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a" Feb 19 19:19:26 crc kubenswrapper[4722]: E0219 19:19:26.072069 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.079484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.079512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.079523 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.079535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.079545 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.182622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.182669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.182681 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.182703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.182715 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.295702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.295749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.295761 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.295785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.295834 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.398023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.398063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.398074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.398092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.398102 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.499848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.499890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.499900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.499914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.499925 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.602246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.602286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.602298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.602314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.602325 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.704636 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.704693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.704710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.704732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.704753 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.807935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.807981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.807998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.808022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.808040 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.911310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.911375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.911396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.911425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:26 crc kubenswrapper[4722]: I0219 19:19:26.911443 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:26Z","lastTransitionTime":"2026-02-19T19:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.017682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.017774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.017799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.017832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.017855 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.057395 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 05:23:48.335135314 +0000 UTC Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.071289 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:27 crc kubenswrapper[4722]: E0219 19:19:27.071481 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.097020 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:27 crc kubenswrapper[4722]: E0219 19:19:27.097235 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:27 crc kubenswrapper[4722]: E0219 19:19:27.097321 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:19:59.09729572 +0000 UTC m=+98.709646074 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.120517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.120544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.120552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.120564 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.120573 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.224053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.224108 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.224126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.224148 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.224213 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.326277 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.326347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.326371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.326397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.326415 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.429096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.429142 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.429186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.429203 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.429213 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.531441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.531503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.531514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.531531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.531541 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.633551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.633619 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.633637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.633662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.633680 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.735487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.735543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.735559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.735585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.735606 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.837801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.837841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.837852 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.837885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.837896 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.941445 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.941548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.941608 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.941638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:27 crc kubenswrapper[4722]: I0219 19:19:27.941696 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:27Z","lastTransitionTime":"2026-02-19T19:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.044765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.044859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.044877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.044936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.044960 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.057927 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 23:02:33.407806425 +0000 UTC Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.070279 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.070304 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.070278 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.070412 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.070527 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.070622 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.147280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.147318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.147329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.147344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.147357 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.249878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.249923 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.249935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.249951 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.249961 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.352076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.352118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.352129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.352145 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.352172 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.454844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.454891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.454902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.454917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.454930 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.557345 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.557380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.557407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.557420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.557428 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.621195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.621251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.621265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.621286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.621301 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.633676 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:28Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.637050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.637082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.637091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.637135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.637146 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.648686 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:28Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.651513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.651528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.651536 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.651546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.651553 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.662278 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:28Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.665170 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.665217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.665227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.665243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.665257 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.677784 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:28Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.680965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.681020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.681032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.681047 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.681057 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.692846 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:28Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:28 crc kubenswrapper[4722]: E0219 19:19:28.692954 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.694370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.694407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.694422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.694438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.694452 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.796704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.796745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.796754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.796770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.796780 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.898892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.898972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.898992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.899019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:28 crc kubenswrapper[4722]: I0219 19:19:28.899038 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:28Z","lastTransitionTime":"2026-02-19T19:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.001219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.001286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.001309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.001339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.001359 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.058344 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 10:37:06.326519749 +0000 UTC Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.071182 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:29 crc kubenswrapper[4722]: E0219 19:19:29.071423 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.103460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.103520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.103537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.103561 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.103579 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.206718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.206778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.206796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.206820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.206839 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.309828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.309888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.309906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.309928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.309945 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.413048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.413084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.413101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.413116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.413126 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.505094 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/0.log" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.505212 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5" containerID="5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877" exitCode=1 Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.505252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerDied","Data":"5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.505816 4722 scope.go:117] "RemoveContainer" containerID="5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.516465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.516568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.516586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.516608 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.516627 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.518190 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 
19:19:29.534667 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc 
kubenswrapper[4722]: I0219 19:19:29.547526 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.562098 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.579326 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.591699 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.611915 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.619450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.619480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.619490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.619504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.619516 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.629109 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.657205 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.671782 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.683217 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.694929 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c0
6f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.707746 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.718132 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.721419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.721451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.721461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.721475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.721485 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.728197 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.737753 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07
cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.748402 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:29Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.823887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.823924 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.823936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.823950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.823962 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.925554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.925693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.925778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.925853 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:29 crc kubenswrapper[4722]: I0219 19:19:29.925922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:29Z","lastTransitionTime":"2026-02-19T19:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.027599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.027641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.027649 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.027662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.027671 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.058898 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 20:30:00.302078097 +0000 UTC Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.070841 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.070878 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.070937 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:30 crc kubenswrapper[4722]: E0219 19:19:30.071083 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:30 crc kubenswrapper[4722]: E0219 19:19:30.071212 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:30 crc kubenswrapper[4722]: E0219 19:19:30.071372 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.129428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.129463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.129472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.129486 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.129495 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.231667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.231724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.231735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.231760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.231771 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.334134 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.334669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.334873 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.335033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.335238 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.438990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.439044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.439058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.439078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.439093 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.510967 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/0.log" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.511418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerStarted","Data":"38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.528707 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.541368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.541458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 
19:19:30.541476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.541501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.541521 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.549629 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.567089 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.583040 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.595544 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.608032 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.620666 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.634697 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.643983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.644030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.644042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.644061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.644075 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.647049 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.658209 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.673701 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.686565 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.700221 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc 
kubenswrapper[4722]: I0219 19:19:30.714448 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.737465 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.747134 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.747176 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.747185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.747220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.747231 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.751379 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.769495 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:30Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.849648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.849693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.849705 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.849722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.849733 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.951566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.951628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.951644 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.951661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:30 crc kubenswrapper[4722]: I0219 19:19:30.951673 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:30Z","lastTransitionTime":"2026-02-19T19:19:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.053834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.053882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.053894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.053916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.053932 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.059008 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 09:54:21.377737464 +0000 UTC Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.070967 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:31 crc kubenswrapper[4722]: E0219 19:19:31.071084 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.086912 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\
"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.100233 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.112394 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.123762 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.137677 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.149355 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.156011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.156050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.156060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.156073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.156083 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.163234 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.175114 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.186110 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.199561 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.212697 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.223380 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc 
kubenswrapper[4722]: I0219 19:19:31.235918 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.248734 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.258583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.258618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.258629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.258646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.258656 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.273874 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.285299 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.296791 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:31Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.361177 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.361214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.361222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.361236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.361245 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.464202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.464280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.464305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.464336 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.464359 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.567363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.567388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.567397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.567410 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.567418 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.669534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.669562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.669571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.669583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.669593 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.772171 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.772204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.772213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.772225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.772233 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.874893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.874932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.874940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.874955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.874965 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.977128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.977197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.977208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.977224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:31 crc kubenswrapper[4722]: I0219 19:19:31.977235 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:31Z","lastTransitionTime":"2026-02-19T19:19:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.059198 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 07:54:46.073560715 +0000 UTC Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.070596 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.070650 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.070710 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:32 crc kubenswrapper[4722]: E0219 19:19:32.070735 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:32 crc kubenswrapper[4722]: E0219 19:19:32.070802 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:32 crc kubenswrapper[4722]: E0219 19:19:32.070873 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.080007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.080044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.080056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.080071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.080082 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.182091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.182136 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.182162 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.182180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.182193 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.284576 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.284625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.284638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.284654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.284667 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.387641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.387725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.387734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.387751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.387762 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.490474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.490522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.490533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.490551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.490563 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.593314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.593349 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.593361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.593382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.593393 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.696647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.696688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.696697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.696711 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.696720 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.798861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.798890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.798899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.798914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.798922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.901964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.902019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.902036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.902058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:32 crc kubenswrapper[4722]: I0219 19:19:32.902075 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:32Z","lastTransitionTime":"2026-02-19T19:19:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.004139 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.004233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.004251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.004273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.004290 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.059573 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 21:15:55.834092976 +0000 UTC
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.071045 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:19:33 crc kubenswrapper[4722]: E0219 19:19:33.071218 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.107033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.107079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.107092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.107110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.107122 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.209449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.209484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.209492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.209506 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.209515 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.312059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.312092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.312102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.312117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.312127 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.414179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.414207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.414217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.414231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.414240 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.516582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.516646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.516673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.516704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.516726 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.618469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.618505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.618514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.618527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.618536 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.720456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.720501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.720514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.720530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.720540 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.823322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.823372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.823385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.823402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.823415 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.926000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.926046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.926059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.926073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:33 crc kubenswrapper[4722]: I0219 19:19:33.926082 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:33Z","lastTransitionTime":"2026-02-19T19:19:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.028651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.028694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.028705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.028722 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.028733 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.060139 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 20:45:04.816004198 +0000 UTC
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.070802 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.070847 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.070903 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:19:34 crc kubenswrapper[4722]: E0219 19:19:34.071007 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:19:34 crc kubenswrapper[4722]: E0219 19:19:34.071180 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:19:34 crc kubenswrapper[4722]: E0219 19:19:34.071254 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.130679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.130735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.130749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.130767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.130778 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.234091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.234172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.234189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.234207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.234218 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.336901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.336952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.336963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.336981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.336994 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.439497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.439542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.439556 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.439578 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.439594 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.541765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.541837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.541859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.541890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.541913 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.643915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.644008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.644033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.644105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.644130 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.749498 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.749568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.749581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.749597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.749609 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.852131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.852213 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.852226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.852264 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.852274 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.955302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.955361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.955373 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.955388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:34 crc kubenswrapper[4722]: I0219 19:19:34.955402 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:34Z","lastTransitionTime":"2026-02-19T19:19:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.057945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.057991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.058003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.058022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.058035 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.061144 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 18:06:52.48138202 +0000 UTC
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.070680 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:19:35 crc kubenswrapper[4722]: E0219 19:19:35.070849 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.160087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.160144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.160195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.160220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.160233 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.262785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.262848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.262862 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.262892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.262965 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.365763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.365827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.365844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.365867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.365886 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.469004 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.469054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.469064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.469079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.469089 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.572706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.572777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.572795 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.572819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.572837 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.676264 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.676372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.676395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.676452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.676569 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.779899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.780041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.780066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.780128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.780197 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.883602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.883657 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.883669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.883688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.883698 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.986543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.986625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.986648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.986680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:35 crc kubenswrapper[4722]: I0219 19:19:35.986704 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:35Z","lastTransitionTime":"2026-02-19T19:19:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.062272 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 00:53:17.349207934 +0000 UTC Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.070812 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.070827 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.070833 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:36 crc kubenswrapper[4722]: E0219 19:19:36.071239 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:36 crc kubenswrapper[4722]: E0219 19:19:36.071034 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:36 crc kubenswrapper[4722]: E0219 19:19:36.071369 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.088820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.088871 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.088883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.088901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.088914 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.191734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.191778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.191789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.191806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.191819 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.294656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.294694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.294704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.294719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.294730 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.401015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.401082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.401105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.401526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.401558 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.504678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.504735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.504752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.504779 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.504827 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.607458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.607526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.607548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.607575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.607598 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.710256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.710363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.710381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.710397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.710411 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.814013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.814051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.814062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.814078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.814090 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.917028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.917102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.917125 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.917207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:36 crc kubenswrapper[4722]: I0219 19:19:36.917226 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:36Z","lastTransitionTime":"2026-02-19T19:19:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.020059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.020116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.020135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.020204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.020232 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.062630 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 22:07:21.478431855 +0000 UTC Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.071550 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:37 crc kubenswrapper[4722]: E0219 19:19:37.071761 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.084031 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.122865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.122913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.122925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.122942 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.122952 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.226336 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.226382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.226394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.226411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.226431 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.329507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.329580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.329604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.329630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.329647 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.432401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.432439 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.432449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.432465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.432477 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.534875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.534925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.534936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.534954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.534967 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.637972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.638003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.638014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.638028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.638038 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.741371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.741435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.741455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.741480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.741504 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.844454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.844515 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.844535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.844559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.844577 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.948527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.948590 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.948608 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.948633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:37 crc kubenswrapper[4722]: I0219 19:19:37.948651 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:37Z","lastTransitionTime":"2026-02-19T19:19:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.051619 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.051689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.051708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.051733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.051752 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.063190 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 13:04:01.257382823 +0000 UTC
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.070548 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.070602 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.070555 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.070744 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.070839 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.070957 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.155235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.155301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.155320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.155351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.155374 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.258703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.258827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.258853 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.258879 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.258898 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.362323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.362412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.362447 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.362476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.362498 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.465713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.465790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.466019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.466049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.466072 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.569069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.569146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.569215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.569253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.569276 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.672626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.672707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.672732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.672760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.672777 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.775752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.775809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.775881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.775944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.775985 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.857890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.857953 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.857971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.857997 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.858014 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.881121 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:38Z is after 2025-08-24T17:21:41Z"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.891387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.891458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.891483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.891512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.891534 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.908367 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:38Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.912602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.912670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.912696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.912724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.912746 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.934061 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:38Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.939675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.939739 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.939757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.939783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.939801 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.959890 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:38Z is after 2025-08-24T17:21:41Z"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.965134 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.965212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.965231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.965255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.965273 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.986404 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7bf15e34-a3dc-4bfd-a83d-49c3d07d7868\\\",\\\"systemUUID\\\":\\\"4cf2b762-873e-4422-8170-f24281d6b9fa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:38Z is after 2025-08-24T17:21:41Z"
Feb 19 19:19:38 crc kubenswrapper[4722]: E0219 19:19:38.986681 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.989211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.989267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.989286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.989311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:38 crc kubenswrapper[4722]: I0219 19:19:38.989330 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:38Z","lastTransitionTime":"2026-02-19T19:19:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.064184 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 23:34:30.023516193 +0000 UTC
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.071082 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:19:39 crc kubenswrapper[4722]: E0219 19:19:39.071888 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.072364 4722 scope.go:117] "RemoveContainer" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.092111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.092204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.092238 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.092269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.092291 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.196701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.197423 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.197455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.197484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.197505 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.301337 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.301375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.301390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.301409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.301420 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.404763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.404806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.404818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.404834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.404846 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.507772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.507817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.507831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.507847 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.507860 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.540481 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/2.log"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.543008 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"}
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.543561 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.565650 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 
handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\
\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-s
cript-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.578494 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17c102d-a233-4540-880b-372c023c3963\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80d8c4416a79661f1a02a815a53525f89e5c524706addaba1bde909dcaae9d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.591849 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.603796 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.610518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.610582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.610605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.610635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.610649 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.617492 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.631317 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.643795 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.657411 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.675512 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.698335 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.712664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.712703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.712712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.712726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.712739 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.716378 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273
666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.729269 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.743860 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.753586 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.764703 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.774249 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.785398 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:39 crc 
kubenswrapper[4722]: I0219 19:19:39.798548 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:39Z is after 2025-08-24T17:21:41Z"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.815752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.815785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.815794 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.815808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.815816 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready:
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.918584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.918634 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.918645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.918662 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:39 crc kubenswrapper[4722]: I0219 19:19:39.918675 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:39Z","lastTransitionTime":"2026-02-19T19:19:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.021965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.022048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.022070 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.022102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.022127 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.064853 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 15:34:30.186356001 +0000 UTC
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.070531 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.070602 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.070610 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:19:40 crc kubenswrapper[4722]: E0219 19:19:40.070754 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:19:40 crc kubenswrapper[4722]: E0219 19:19:40.070912 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:19:40 crc kubenswrapper[4722]: E0219 19:19:40.071132 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.125859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.125930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.125952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.125981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.125999 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.228672 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.228710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.228720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.228734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.228744 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.331472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.331630 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.331644 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.331663 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.331677 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.435272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.435317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.435328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.435344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.435358 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.538310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.538355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.538366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.538383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.538394 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.550797 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/3.log"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.551902 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/2.log"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.555767 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" exitCode=1
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.555835 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"}
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.555922 4722 scope.go:117] "RemoveContainer" containerID="841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.557077 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"
Feb 19 19:19:40 crc kubenswrapper[4722]: E0219 19:19:40.557433 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5"
Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.583538 4722 status_manager.go:875] "Failed to update
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.602237 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.620141 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.642048 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.644549 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.644607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.644625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.644650 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.644667 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.655976 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.674339 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07
cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.689253 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-v
ar-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.706217 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c0
6f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.720696 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.731388 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.739932 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.747123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.747217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.747240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.747269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.747292 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.752855 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.762455 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.773592 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc 
kubenswrapper[4722]: I0219 19:19:40.782324 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.794075 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.822182 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:39Z\\\",\\\"message\\\":\\\"io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007681e6b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: 
machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.16],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0219 19:19:39.913543 6799 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neig\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-
bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.840057 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17c102d-a233-4540-880b-372c023c3963\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80d8c4416a79661f1a02a815a53525f89e5c524706addaba1bde909dcaae9d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cba
a1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:40Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.849943 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.849979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.850008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.850026 4722 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.850040 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.952647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.952706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.952746 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.952762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:40 crc kubenswrapper[4722]: I0219 19:19:40.952772 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:40Z","lastTransitionTime":"2026-02-19T19:19:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.055588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.055654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.055671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.055695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.055713 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.066027 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 14:49:59.929348512 +0000 UTC Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.070578 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:41 crc kubenswrapper[4722]: E0219 19:19:41.070698 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.086245 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls
\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.107456 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.144323 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.159435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.159493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.159511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.159538 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.159613 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.172890 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273
666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.193204 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.205457 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.217225 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.233828 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1
ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.243066 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa93
4d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.254179 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc 
kubenswrapper[4722]: I0219 19:19:41.261915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.261961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.261972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.261990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.262002 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.269377 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78d
f8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.285235 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.295557 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.304370 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17c102d-a233-4540-880b-372c023c3963\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80d8c4416a79661f1a02a815a53525f89e5c524706addaba1bde909dcaae9d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.320696 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.348416 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://841aba8809962ff1733f1f7a602cdeabbe404a6ace7dc1a29f0c9285a9a7cd9a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:11Z\\\",\\\"message\\\":\\\"space event handler 5\\\\nI0219 19:19:11.030460 6395 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 19:19:11.030476 6395 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 19:19:11.030488 6395 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 19:19:11.030497 6395 handler.go:208] Removed *v1.Node 
event handler 2\\\\nI0219 19:19:11.030553 6395 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0219 19:19:11.030750 6395 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 19:19:11.031201 6395 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031258 6395 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031327 6395 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 19:19:11.031451 6395 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:39Z\\\",\\\"message\\\":\\\"io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007681e6b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: 
machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.16],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0219 19:19:39.913543 6799 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neig\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-
bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.361172 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.363468 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.363500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.363515 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.363530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.363539 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.374175 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.465718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.465762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.465771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.465784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.465794 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.560064 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/3.log" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.562728 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:19:41 crc kubenswrapper[4722]: E0219 19:19:41.562848 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.567243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.567269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.567279 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.567291 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.567301 4722 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.579676 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f680526f4c6584b66756bcd44381a5c4b33488e1f7466d945c728aced1311939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4793bb71b667567adb24ee15eea99035a3e38d5566ed4d690daeedcc96182cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.610700 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb7c404-f96e-43a7-b20f-b45d856c75a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:39Z\\\",\\\"message\\\":\\\"io/serving-cert-secret-name:mcc-proxy-tls service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007681e6b \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:9001,TargetPort:{0 9001 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: machine-config-controller,},ClusterIP:10.217.5.16,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.16],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0219 19:19:39.913543 6799 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/control-plane-machine-set-operator]} name:Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neig\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:19:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://952f71c6302bc1ea73
7280f5c1125bf7c6bd72572471827b47472c52f88f93b1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zjr2p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsfln\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.622315 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17c102d-a233-4540-880b-372c023c3963\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80d8c4416a79661f1a02a815a53525f89e5c524706addaba1bde909dcaae9d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://333558175a62ca9a2c09ea042f5ed04cbaa1c61dd2e87e39a87a6e6bbc5100ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.635052 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.651484 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.664620 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"abbfd53b-68db-4f79-8749-fe4bdebebc95\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f5ae0552604404f2b0bdc05a12734429847af0d1d83d272b2098b6a3662d23b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e569ce217b3778106f81ddd5e7ae8429997c44381b1e75077df952932727d61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d890362aefd1603e6106a04e01995f2f9144ec74e4e558114a0f6c98856a9ca4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.669299 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.669369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.669379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.669394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.669403 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.681367 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91f6482f2834333b9eb995d2a311054225d06a7c58961d2d3d2dc99b220eb11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.694810 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xq6bx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fad04006-ed10-4444-ae85-9c0a31a95466\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffaf57287deebb26d441478fd7cc496d0a84e4f1d58f40d1c497a98fc0e5c5cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vsjws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xq6bx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.705048 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b265ff4c-d096-4b39-8032-fe0b84354832\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://acfce7b1c07cc178ba67317c2e7ee3d2656a3cf806275c12b9651d98e0e8de37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fc
b98b5479b8867f8dd667782d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkfk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-w8zrl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.715904 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jnvgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a80fcd7-8ac4-4e82-8f14-93d225898bb5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T19:19:28Z\\\",\\\"message\\\":\\\"2026-02-19T19:18:43+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e\\\\n2026-02-19T19:18:43+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cee07b4b-9ca7-4aa9-9ba4-5c569d107c6e to /host/opt/cni/bin/\\\\n2026-02-19T19:18:43Z [verbose] multus-daemon started\\\\n2026-02-19T19:18:43Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T19:19:28Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:19:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n829t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jnvgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.728392 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3974ea1e-a55a-4504-aec2-f9aab56fd6da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0219 19:18:34.756259 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 19:18:34.759013 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1768218956/tls.crt::/tmp/serving-cert-1768218956/tls.key\\\\\\\"\\\\nI0219 19:18:40.386241 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0219 19:18:40.390508 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0219 19:18:40.390529 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0219 19:18:40.390554 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0219 19:18:40.390564 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0219 19:18:40.399316 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0219 19:18:40.399313 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0219 19:18:40.399353 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399359 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0219 19:18:40.399365 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0219 19:18:40.399369 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0219 19:18:40.399373 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0219 19:18:40.399378 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0219 19:18:40.400464 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.742275 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b3c88da485edb0853032caf60405a8ebf43369b1429468378e36e7046ef58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.758410 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.771963 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.772023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.772040 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 
19:19:41.772061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.772022 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-lwpgw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9734f69-4441-4618-849c-54e0aca328e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7f1e3853782e359da0c870cbccd7e25d8d92d470ee413192957b382aacb79d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bbv9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-lwpgw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.772076 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.785207 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"285e13d6-a3ce-4bc2-9be4-bb6db3593a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb26f5eda88d7dbc72037ab2d4a708ad40ccf220aa7b2f91ab38882bedad9ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c53a5aac58e7c697cd0506228f13f45ee8deb2913766efccd688bedfd2943330\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://3e88312128b0e999d44b8108b98897127cd6f786bcb4010a4fe7512c95415df1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf26bbe64983f7d49132d3bba7195e0d44df0060fafd813c1037ec0be499c49b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cce5bed3ccb22f1041df4b4e6b4a69779619f5816054da8dcb95f8de2525441b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c67ed2c9fa3970f437b839c3b43c5cb1ead9026d73faf2769e0191bb6b2de009\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ad30bb81d8385616169c9b5b942278eb2f8ff4d2c0ad5b91cfef87c70da8f61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75c45\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7g5gg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.798327 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20b917d0-317d-4ce9-96e2-b1aa95f89663\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b43a0c72fd502363392223ce09cb6fcf8db36315d3c62116a38fe9ea90b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de870ca8410d45b3ac135d32c1b0f483fa934d8303bd39a5174e321e12a1a63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26lqm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qt9f4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.812009 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"493acad5-7300-4941-9311-19b3d5f21786\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:55Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpp4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:55Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-s6hhp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc 
kubenswrapper[4722]: I0219 19:19:41.822731 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e3e7241-c132-41ff-83a7-f2f49691ab84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:19:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T19:18:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://923f99e071489e8408d6b42461f5461bb3cb2341a60ff628179d2c638540368b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ce1f442044017bb5216aa5fbdc78df8c62131c65e0ceb25dc6601898afced14\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0194ec044bfb34cc472a6b8e23b423e468fcbe641bcaa5591612f73f17567555\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T19:18:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://610fa4c060a51a0b5a0e3010b8b748f09e5cd2950aa27da430d4bce8271dc4f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T19:18:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T19:18:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T19:18:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T19:19:41Z is after 2025-08-24T17:21:41Z" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.874202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.874255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.874267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.874284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.874293 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.977027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.977098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.977120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.977198 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:41 crc kubenswrapper[4722]: I0219 19:19:41.977224 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:41Z","lastTransitionTime":"2026-02-19T19:19:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.066767 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:14:43.971706931 +0000 UTC Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.071196 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.071245 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.071206 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:42 crc kubenswrapper[4722]: E0219 19:19:42.071357 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:42 crc kubenswrapper[4722]: E0219 19:19:42.071527 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:42 crc kubenswrapper[4722]: E0219 19:19:42.071688 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.080757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.080814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.080838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.080867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.080893 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.185109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.185348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.185379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.185410 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.185433 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.288390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.288467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.288674 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.288699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.288977 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.391009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.391065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.391088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.391117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.391139 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.494057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.494203 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.494229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.494260 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.494285 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.596770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.596836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.596859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.596889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.596915 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.700076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.700115 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.700131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.700184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.700202 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.803001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.803282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.803380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.803858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.803903 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.906490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.906548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.906571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.906598 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:42 crc kubenswrapper[4722]: I0219 19:19:42.906621 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:42Z","lastTransitionTime":"2026-02-19T19:19:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.008974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.009036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.009058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.009085 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.009106 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.067598 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 04:38:44.365254537 +0000 UTC Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.071372 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:43 crc kubenswrapper[4722]: E0219 19:19:43.071620 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.111863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.111926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.111947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.111974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.111994 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.214362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.214443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.214470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.214502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.214526 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.317859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.317960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.317979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.318002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.318016 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.427684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.427778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.427808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.427844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.427886 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.531388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.531504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.531523 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.531548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.531565 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.634754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.634832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.634850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.634877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.634897 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.738115 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.738252 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.738273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.738300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.738317 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.841544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.841619 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.841642 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.841671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.841691 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.944015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.944054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.944066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.944084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:43 crc kubenswrapper[4722]: I0219 19:19:43.944097 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:43Z","lastTransitionTime":"2026-02-19T19:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.046774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.046827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.046843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.046865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.046882 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.067717 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 20:13:23.722768903 +0000 UTC Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.071108 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.071204 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.071111 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.071314 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.071425 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.071552 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.150368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.150418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.150429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.150449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.150462 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.254449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.254490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.254506 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.254524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.254541 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.275705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.275895 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:20:48.275854089 +0000 UTC m=+147.888204453 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.357717 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.357775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.357789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.357810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.357932 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.377708 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.377800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.377913 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.377979 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378040 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378070 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378123 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378124 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378145 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378079 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378192 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378214 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:44 crc 
kubenswrapper[4722]: E0219 19:19:44.378194 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.378140817 +0000 UTC m=+147.990491141 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378259 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.3782351 +0000 UTC m=+147.990585464 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378284 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.378271211 +0000 UTC m=+147.990621575 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:44 crc kubenswrapper[4722]: E0219 19:19:44.378306 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.378294701 +0000 UTC m=+147.990645065 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.461600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.461886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.461905 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.461927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.461945 4722 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.564982 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.565050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.565069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.565096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.565115 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.667167 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.667218 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.667230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.667248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.667262 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.770204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.770258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.770275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.770300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.770317 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.872252 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.872340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.872363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.872395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.872418 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.975911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.975979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.976052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.976077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:44 crc kubenswrapper[4722]: I0219 19:19:44.976095 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:44Z","lastTransitionTime":"2026-02-19T19:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.068606 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 23:28:54.133234058 +0000 UTC Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.071305 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:45 crc kubenswrapper[4722]: E0219 19:19:45.071496 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.078606 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.078685 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.078700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.078725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.078741 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.182322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.182402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.182424 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.182454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.182481 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.286125 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.286204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.286224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.286246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.286264 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.389380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.389428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.389445 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.389463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.389478 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.492095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.492195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.492223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.492255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.492282 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.595490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.595545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.595562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.595585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.595603 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.698518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.698588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.698602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.698654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.698667 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.802056 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.802109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.802128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.802195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.802221 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.905709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.905769 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.905787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.905812 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:45 crc kubenswrapper[4722]: I0219 19:19:45.905830 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:45Z","lastTransitionTime":"2026-02-19T19:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.008995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.009029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.009042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.009059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.009070 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.069615 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 09:55:36.57119654 +0000 UTC
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.070938 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.071051 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.070938 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:19:46 crc kubenswrapper[4722]: E0219 19:19:46.071214 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:19:46 crc kubenswrapper[4722]: E0219 19:19:46.071292 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:19:46 crc kubenswrapper[4722]: E0219 19:19:46.071380 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.117883 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.117964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.117989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.118022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.118045 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.220716 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.220789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.220814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.220839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.220857 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.324402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.324553 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.324584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.324610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.324629 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.426524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.426558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.426568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.426582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.426595 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.529489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.529517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.529526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.529539 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.529547 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.632562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.632633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.632659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.632689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.632714 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.736278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.736330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.736347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.736371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.736389 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.839077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.839114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.839126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.839141 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.839175 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.942072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.942121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.942133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.942182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:46 crc kubenswrapper[4722]: I0219 19:19:46.942195 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:46Z","lastTransitionTime":"2026-02-19T19:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.044272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.044311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.044320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.044334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.044344 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.069823 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 00:34:20.00876621 +0000 UTC
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.072316 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:19:47 crc kubenswrapper[4722]: E0219 19:19:47.072493 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.147605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.147672 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.147691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.147715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.147733 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.250200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.250259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.250276 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.250301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.250322 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.353864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.353920 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.353937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.353962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.353981 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.456862 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.456930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.456948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.456973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.456990 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.560535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.560606 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.560628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.560655 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.560678 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.663911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.663991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.664016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.664049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.664072 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.766641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.766700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.766747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.766782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.766804 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.869091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.869123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.869133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.869167 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.869179 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.972086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.972131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.972144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.972181 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:47 crc kubenswrapper[4722]: I0219 19:19:47.972194 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:47Z","lastTransitionTime":"2026-02-19T19:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.070135 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 13:05:39.528372624 +0000 UTC
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.070317 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.070388 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.070326 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:19:48 crc kubenswrapper[4722]: E0219 19:19:48.070462 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:19:48 crc kubenswrapper[4722]: E0219 19:19:48.070617 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:19:48 crc kubenswrapper[4722]: E0219 19:19:48.070678 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.073998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.074027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.074037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.074053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.074065 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.177210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.177293 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.177314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.177533 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.177551 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.280680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.280731 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.280744 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.280766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.280783 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.383571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.383653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.383668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.383687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.383703 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.486415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.486482 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.486496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.486521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.486541 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.590214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.590275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.590289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.590311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.590327 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.694758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.694825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.694844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.694948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.694972 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.798446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.798524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.798545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.798573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.798595 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.903093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.903226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.903238 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.903259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:48 crc kubenswrapper[4722]: I0219 19:19:48.903271 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:48Z","lastTransitionTime":"2026-02-19T19:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.006625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.006683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.006694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.006714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.006729 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:49Z","lastTransitionTime":"2026-02-19T19:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.070973 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 17:14:34.340507253 +0000 UTC Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.071270 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:49 crc kubenswrapper[4722]: E0219 19:19:49.071612 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.110379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.110461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.110494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.110546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.110574 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:49Z","lastTransitionTime":"2026-02-19T19:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.206875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.206960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.206987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.207129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.207193 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:49Z","lastTransitionTime":"2026-02-19T19:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.240223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.240281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.240302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.240332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.240353 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T19:19:49Z","lastTransitionTime":"2026-02-19T19:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.275555 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s"] Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.276097 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.280263 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.280533 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.280610 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.280553 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.338485 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.338585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.338658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.338708 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.338759 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.340741 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=63.340710849 podStartE2EDuration="1m3.340710849s" podCreationTimestamp="2026-02-19 19:18:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.340132991 +0000 UTC m=+88.952483355" watchObservedRunningTime="2026-02-19 19:19:49.340710849 +0000 UTC m=+88.953061203" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.341078 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.34106602 podStartE2EDuration="1m8.34106602s" 
podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.314205776 +0000 UTC m=+88.926556110" watchObservedRunningTime="2026-02-19 19:19:49.34106602 +0000 UTC m=+88.953416384" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.370241 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xq6bx" podStartSLOduration=69.370215236 podStartE2EDuration="1m9.370215236s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.369861395 +0000 UTC m=+88.982211759" watchObservedRunningTime="2026-02-19 19:19:49.370215236 +0000 UTC m=+88.982565580" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.382705 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podStartSLOduration=69.382679162 podStartE2EDuration="1m9.382679162s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.382261889 +0000 UTC m=+88.994612243" watchObservedRunningTime="2026-02-19 19:19:49.382679162 +0000 UTC m=+88.995029486" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.395902 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jnvgg" podStartSLOduration=69.395887492 podStartE2EDuration="1m9.395887492s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.395728917 +0000 UTC m=+89.008079251" watchObservedRunningTime="2026-02-19 
19:19:49.395887492 +0000 UTC m=+89.008237816" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.426394 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.426378419 podStartE2EDuration="39.426378419s" podCreationTimestamp="2026-02-19 19:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.425902575 +0000 UTC m=+89.038252949" watchObservedRunningTime="2026-02-19 19:19:49.426378419 +0000 UTC m=+89.038728743" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440592 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440652 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440702 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.440942 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.442171 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: 
\"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.451941 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.461280 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-g7m5s\" (UID: \"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.493809 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7g5gg" podStartSLOduration=69.493791533 podStartE2EDuration="1m9.493791533s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.492565286 +0000 UTC m=+89.104915610" watchObservedRunningTime="2026-02-19 19:19:49.493791533 +0000 UTC m=+89.106141857" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.494066 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lwpgw" podStartSLOduration=69.494060052 podStartE2EDuration="1m9.494060052s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.471705547 
+0000 UTC m=+89.084055871" watchObservedRunningTime="2026-02-19 19:19:49.494060052 +0000 UTC m=+89.106410376" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.518519 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qt9f4" podStartSLOduration=68.518498951 podStartE2EDuration="1m8.518498951s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.505989492 +0000 UTC m=+89.118339836" watchObservedRunningTime="2026-02-19 19:19:49.518498951 +0000 UTC m=+89.130849285" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.539047 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=12.539027968 podStartE2EDuration="12.539027968s" podCreationTimestamp="2026-02-19 19:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:49.537800851 +0000 UTC m=+89.150151195" watchObservedRunningTime="2026-02-19 19:19:49.539027968 +0000 UTC m=+89.151378302" Feb 19 19:19:49 crc kubenswrapper[4722]: I0219 19:19:49.595015 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" Feb 19 19:19:49 crc kubenswrapper[4722]: W0219 19:19:49.610977 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24e0afb8_cdad_45eb_a49f_e9f6ca11ec1b.slice/crio-5bb846eb635efc0e4472b1a1481d792e4b5b1a6015c48870a964fb45d2ddf064 WatchSource:0}: Error finding container 5bb846eb635efc0e4472b1a1481d792e4b5b1a6015c48870a964fb45d2ddf064: Status 404 returned error can't find the container with id 5bb846eb635efc0e4472b1a1481d792e4b5b1a6015c48870a964fb45d2ddf064 Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.070679 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.070699 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.070702 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.071324 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 19:52:55.760042728 +0000 UTC Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.071697 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 19 19:19:50 crc kubenswrapper[4722]: E0219 19:19:50.073014 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:50 crc kubenswrapper[4722]: E0219 19:19:50.073405 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:50 crc kubenswrapper[4722]: E0219 19:19:50.073687 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.082107 4722 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.590795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" event={"ID":"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b","Type":"ContainerStarted","Data":"57bb636fc11755ab566eb5dcb02b3dc00443892c8a61fd8931403141cf1eb485"} Feb 19 19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.590852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" event={"ID":"24e0afb8-cdad-45eb-a49f-e9f6ca11ec1b","Type":"ContainerStarted","Data":"5bb846eb635efc0e4472b1a1481d792e4b5b1a6015c48870a964fb45d2ddf064"} Feb 19 
19:19:50 crc kubenswrapper[4722]: I0219 19:19:50.605046 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-g7m5s" podStartSLOduration=70.60502861 podStartE2EDuration="1m10.60502861s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:19:50.604401761 +0000 UTC m=+90.216752095" watchObservedRunningTime="2026-02-19 19:19:50.60502861 +0000 UTC m=+90.217378934" Feb 19 19:19:51 crc kubenswrapper[4722]: I0219 19:19:51.070646 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:51 crc kubenswrapper[4722]: E0219 19:19:51.071684 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:52 crc kubenswrapper[4722]: I0219 19:19:52.071044 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:52 crc kubenswrapper[4722]: I0219 19:19:52.071078 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:52 crc kubenswrapper[4722]: E0219 19:19:52.071205 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:52 crc kubenswrapper[4722]: I0219 19:19:52.071221 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:52 crc kubenswrapper[4722]: E0219 19:19:52.071293 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:52 crc kubenswrapper[4722]: E0219 19:19:52.071380 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:53 crc kubenswrapper[4722]: I0219 19:19:53.071095 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:53 crc kubenswrapper[4722]: E0219 19:19:53.071572 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:54 crc kubenswrapper[4722]: I0219 19:19:54.070734 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:54 crc kubenswrapper[4722]: I0219 19:19:54.070817 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:54 crc kubenswrapper[4722]: I0219 19:19:54.070743 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:54 crc kubenswrapper[4722]: E0219 19:19:54.070911 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:54 crc kubenswrapper[4722]: E0219 19:19:54.071005 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:54 crc kubenswrapper[4722]: E0219 19:19:54.071141 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:55 crc kubenswrapper[4722]: I0219 19:19:55.071128 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:55 crc kubenswrapper[4722]: E0219 19:19:55.071886 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:55 crc kubenswrapper[4722]: I0219 19:19:55.091631 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 19 19:19:56 crc kubenswrapper[4722]: I0219 19:19:56.071138 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:56 crc kubenswrapper[4722]: I0219 19:19:56.071288 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:56 crc kubenswrapper[4722]: E0219 19:19:56.071340 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:56 crc kubenswrapper[4722]: E0219 19:19:56.071501 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:56 crc kubenswrapper[4722]: I0219 19:19:56.071533 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:56 crc kubenswrapper[4722]: E0219 19:19:56.072052 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:56 crc kubenswrapper[4722]: I0219 19:19:56.072536 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:19:56 crc kubenswrapper[4722]: E0219 19:19:56.072759 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:19:57 crc kubenswrapper[4722]: I0219 19:19:57.070403 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:57 crc kubenswrapper[4722]: E0219 19:19:57.070622 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:58 crc kubenswrapper[4722]: I0219 19:19:58.071232 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:19:58 crc kubenswrapper[4722]: I0219 19:19:58.071265 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:19:58 crc kubenswrapper[4722]: I0219 19:19:58.071340 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:19:58 crc kubenswrapper[4722]: E0219 19:19:58.072432 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:19:58 crc kubenswrapper[4722]: E0219 19:19:58.072698 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:19:58 crc kubenswrapper[4722]: E0219 19:19:58.072758 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:19:59 crc kubenswrapper[4722]: I0219 19:19:59.070940 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:59 crc kubenswrapper[4722]: E0219 19:19:59.071137 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:19:59 crc kubenswrapper[4722]: I0219 19:19:59.151581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:19:59 crc kubenswrapper[4722]: E0219 19:19:59.151765 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:19:59 crc kubenswrapper[4722]: E0219 19:19:59.152362 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs podName:493acad5-7300-4941-9311-19b3d5f21786 nodeName:}" failed. No retries permitted until 2026-02-19 19:21:03.152325496 +0000 UTC m=+162.764675850 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs") pod "network-metrics-daemon-s6hhp" (UID: "493acad5-7300-4941-9311-19b3d5f21786") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 19:20:00 crc kubenswrapper[4722]: I0219 19:20:00.070685 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:00 crc kubenswrapper[4722]: I0219 19:20:00.070746 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:00 crc kubenswrapper[4722]: I0219 19:20:00.070748 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:00 crc kubenswrapper[4722]: E0219 19:20:00.070900 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:00 crc kubenswrapper[4722]: E0219 19:20:00.071063 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:00 crc kubenswrapper[4722]: E0219 19:20:00.071388 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:01 crc kubenswrapper[4722]: I0219 19:20:01.072448 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:01 crc kubenswrapper[4722]: E0219 19:20:01.072578 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:01 crc kubenswrapper[4722]: I0219 19:20:01.108801 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=6.108772576 podStartE2EDuration="6.108772576s" podCreationTimestamp="2026-02-19 19:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:01.105958869 +0000 UTC m=+100.718309243" watchObservedRunningTime="2026-02-19 19:20:01.108772576 +0000 UTC m=+100.721122940" Feb 19 19:20:02 crc kubenswrapper[4722]: I0219 19:20:02.071120 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:02 crc kubenswrapper[4722]: I0219 19:20:02.071255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:02 crc kubenswrapper[4722]: I0219 19:20:02.071181 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:02 crc kubenswrapper[4722]: E0219 19:20:02.071358 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:02 crc kubenswrapper[4722]: E0219 19:20:02.071493 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:02 crc kubenswrapper[4722]: E0219 19:20:02.071660 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:03 crc kubenswrapper[4722]: I0219 19:20:03.071115 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:03 crc kubenswrapper[4722]: E0219 19:20:03.071397 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:04 crc kubenswrapper[4722]: I0219 19:20:04.070612 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:04 crc kubenswrapper[4722]: I0219 19:20:04.070657 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:04 crc kubenswrapper[4722]: I0219 19:20:04.070612 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:04 crc kubenswrapper[4722]: E0219 19:20:04.070808 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:04 crc kubenswrapper[4722]: E0219 19:20:04.070994 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:04 crc kubenswrapper[4722]: E0219 19:20:04.071073 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:05 crc kubenswrapper[4722]: I0219 19:20:05.070565 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:05 crc kubenswrapper[4722]: E0219 19:20:05.070712 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:06 crc kubenswrapper[4722]: I0219 19:20:06.071144 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:06 crc kubenswrapper[4722]: I0219 19:20:06.071200 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:06 crc kubenswrapper[4722]: I0219 19:20:06.071277 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:06 crc kubenswrapper[4722]: E0219 19:20:06.071463 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:06 crc kubenswrapper[4722]: E0219 19:20:06.071595 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:06 crc kubenswrapper[4722]: E0219 19:20:06.071730 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:07 crc kubenswrapper[4722]: I0219 19:20:07.070770 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:07 crc kubenswrapper[4722]: E0219 19:20:07.071699 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:07 crc kubenswrapper[4722]: I0219 19:20:07.071747 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:20:07 crc kubenswrapper[4722]: E0219 19:20:07.072810 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsfln_openshift-ovn-kubernetes(5eb7c404-f96e-43a7-b20f-b45d856c75a5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" Feb 19 19:20:08 crc kubenswrapper[4722]: I0219 19:20:08.071106 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:08 crc kubenswrapper[4722]: I0219 19:20:08.071097 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:08 crc kubenswrapper[4722]: I0219 19:20:08.071097 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:08 crc kubenswrapper[4722]: E0219 19:20:08.071528 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:08 crc kubenswrapper[4722]: E0219 19:20:08.071704 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:08 crc kubenswrapper[4722]: E0219 19:20:08.072034 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:09 crc kubenswrapper[4722]: I0219 19:20:09.070429 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:09 crc kubenswrapper[4722]: E0219 19:20:09.070673 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:10 crc kubenswrapper[4722]: I0219 19:20:10.070570 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:10 crc kubenswrapper[4722]: E0219 19:20:10.070736 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:10 crc kubenswrapper[4722]: I0219 19:20:10.070974 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:10 crc kubenswrapper[4722]: E0219 19:20:10.071060 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:10 crc kubenswrapper[4722]: I0219 19:20:10.070567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:10 crc kubenswrapper[4722]: E0219 19:20:10.071364 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:11 crc kubenswrapper[4722]: I0219 19:20:11.070504 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:11 crc kubenswrapper[4722]: E0219 19:20:11.072669 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:12 crc kubenswrapper[4722]: I0219 19:20:12.071363 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:12 crc kubenswrapper[4722]: I0219 19:20:12.071418 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:12 crc kubenswrapper[4722]: I0219 19:20:12.071526 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:12 crc kubenswrapper[4722]: E0219 19:20:12.071744 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:12 crc kubenswrapper[4722]: E0219 19:20:12.071868 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 19:20:12 crc kubenswrapper[4722]: E0219 19:20:12.071948 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 19:20:13 crc kubenswrapper[4722]: I0219 19:20:13.071123 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:20:13 crc kubenswrapper[4722]: E0219 19:20:13.071731 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786" Feb 19 19:20:14 crc kubenswrapper[4722]: I0219 19:20:14.070713 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 19:20:14 crc kubenswrapper[4722]: I0219 19:20:14.070798 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 19:20:14 crc kubenswrapper[4722]: E0219 19:20:14.070851 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 19:20:14 crc kubenswrapper[4722]: I0219 19:20:14.070883 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:20:14 crc kubenswrapper[4722]: E0219 19:20:14.071063 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:20:14 crc kubenswrapper[4722]: E0219 19:20:14.071219 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.071440 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:20:15 crc kubenswrapper[4722]: E0219 19:20:15.071612 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.677579 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/1.log"
Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.678483 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/0.log"
Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.678561 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5" containerID="38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16" exitCode=1
Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.678617 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerDied","Data":"38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16"}
Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.678672 4722 scope.go:117] "RemoveContainer" containerID="5bbdcdccf2d624e69f73de45d92aab6f353d8e131918f57e6406d08bc5524877"
Feb 19 19:20:15 crc kubenswrapper[4722]: I0219 19:20:15.679343 4722 scope.go:117] "RemoveContainer" containerID="38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16"
Feb 19 19:20:15 crc kubenswrapper[4722]: E0219 19:20:15.679626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-jnvgg_openshift-multus(7a80fcd7-8ac4-4e82-8f14-93d225898bb5)\"" pod="openshift-multus/multus-jnvgg" podUID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5"
Feb 19 19:20:16 crc kubenswrapper[4722]: I0219 19:20:16.070546 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:16 crc kubenswrapper[4722]: I0219 19:20:16.070557 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:16 crc kubenswrapper[4722]: E0219 19:20:16.070823 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:20:16 crc kubenswrapper[4722]: I0219 19:20:16.070608 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:16 crc kubenswrapper[4722]: E0219 19:20:16.071004 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:20:16 crc kubenswrapper[4722]: E0219 19:20:16.071267 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:20:16 crc kubenswrapper[4722]: I0219 19:20:16.684779 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/1.log"
Feb 19 19:20:17 crc kubenswrapper[4722]: I0219 19:20:17.070738 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:20:17 crc kubenswrapper[4722]: E0219 19:20:17.070949 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:20:18 crc kubenswrapper[4722]: I0219 19:20:18.070911 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:18 crc kubenswrapper[4722]: E0219 19:20:18.071139 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:20:18 crc kubenswrapper[4722]: I0219 19:20:18.071213 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:18 crc kubenswrapper[4722]: I0219 19:20:18.071257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:18 crc kubenswrapper[4722]: E0219 19:20:18.071401 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:20:18 crc kubenswrapper[4722]: E0219 19:20:18.071526 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:20:19 crc kubenswrapper[4722]: I0219 19:20:19.071432 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:20:19 crc kubenswrapper[4722]: E0219 19:20:19.071633 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:20:20 crc kubenswrapper[4722]: I0219 19:20:20.070568 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:20 crc kubenswrapper[4722]: I0219 19:20:20.070649 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:20 crc kubenswrapper[4722]: E0219 19:20:20.070772 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:20:20 crc kubenswrapper[4722]: E0219 19:20:20.070987 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:20:20 crc kubenswrapper[4722]: I0219 19:20:20.071011 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:20 crc kubenswrapper[4722]: E0219 19:20:20.071267 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:20:21 crc kubenswrapper[4722]: E0219 19:20:21.036514 4722 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.073993 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:20:21 crc kubenswrapper[4722]: E0219 19:20:21.074248 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.074607 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"
Feb 19 19:20:21 crc kubenswrapper[4722]: E0219 19:20:21.167584 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.702196 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/3.log"
Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.709142 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerStarted","Data":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"}
Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.709544 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln"
Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.886576 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podStartSLOduration=100.886557215 podStartE2EDuration="1m40.886557215s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:21.756468324 +0000 UTC m=+121.368818668" watchObservedRunningTime="2026-02-19 19:20:21.886557215 +0000 UTC m=+121.498907539"
Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.887174 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s6hhp"]
Feb 19 19:20:21 crc kubenswrapper[4722]: I0219 19:20:21.887295 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:20:21 crc kubenswrapper[4722]: E0219 19:20:21.887449 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:20:22 crc kubenswrapper[4722]: I0219 19:20:22.070915 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:22 crc kubenswrapper[4722]: E0219 19:20:22.071056 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:20:22 crc kubenswrapper[4722]: I0219 19:20:22.070945 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:22 crc kubenswrapper[4722]: E0219 19:20:22.071136 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:20:22 crc kubenswrapper[4722]: I0219 19:20:22.070932 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:22 crc kubenswrapper[4722]: E0219 19:20:22.071226 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:20:24 crc kubenswrapper[4722]: I0219 19:20:24.070242 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:24 crc kubenswrapper[4722]: I0219 19:20:24.070299 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:20:24 crc kubenswrapper[4722]: E0219 19:20:24.070486 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:20:24 crc kubenswrapper[4722]: I0219 19:20:24.070554 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:24 crc kubenswrapper[4722]: I0219 19:20:24.070642 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:24 crc kubenswrapper[4722]: E0219 19:20:24.070764 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:20:24 crc kubenswrapper[4722]: E0219 19:20:24.070957 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:20:24 crc kubenswrapper[4722]: E0219 19:20:24.071114 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:20:26 crc kubenswrapper[4722]: I0219 19:20:26.070588 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:26 crc kubenswrapper[4722]: E0219 19:20:26.070720 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:20:26 crc kubenswrapper[4722]: I0219 19:20:26.070587 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:26 crc kubenswrapper[4722]: E0219 19:20:26.070793 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:20:26 crc kubenswrapper[4722]: I0219 19:20:26.070592 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:20:26 crc kubenswrapper[4722]: I0219 19:20:26.070794 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:26 crc kubenswrapper[4722]: E0219 19:20:26.071007 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:20:26 crc kubenswrapper[4722]: E0219 19:20:26.070840 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:20:26 crc kubenswrapper[4722]: E0219 19:20:26.168823 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:20:28 crc kubenswrapper[4722]: I0219 19:20:28.070672 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:28 crc kubenswrapper[4722]: I0219 19:20:28.070686 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:28 crc kubenswrapper[4722]: I0219 19:20:28.071373 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:28 crc kubenswrapper[4722]: I0219 19:20:28.070755 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:20:28 crc kubenswrapper[4722]: E0219 19:20:28.071437 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:20:28 crc kubenswrapper[4722]: E0219 19:20:28.071551 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:20:28 crc kubenswrapper[4722]: E0219 19:20:28.071835 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:20:28 crc kubenswrapper[4722]: E0219 19:20:28.072370 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:20:29 crc kubenswrapper[4722]: I0219 19:20:29.071504 4722 scope.go:117] "RemoveContainer" containerID="38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16"
Feb 19 19:20:29 crc kubenswrapper[4722]: I0219 19:20:29.746516 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/1.log"
Feb 19 19:20:29 crc kubenswrapper[4722]: I0219 19:20:29.746595 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerStarted","Data":"1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2"}
Feb 19 19:20:30 crc kubenswrapper[4722]: I0219 19:20:30.070236 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:20:30 crc kubenswrapper[4722]: I0219 19:20:30.070274 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:30 crc kubenswrapper[4722]: I0219 19:20:30.070252 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:30 crc kubenswrapper[4722]: I0219 19:20:30.070237 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:30 crc kubenswrapper[4722]: E0219 19:20:30.070363 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-s6hhp" podUID="493acad5-7300-4941-9311-19b3d5f21786"
Feb 19 19:20:30 crc kubenswrapper[4722]: E0219 19:20:30.070543 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 19:20:30 crc kubenswrapper[4722]: E0219 19:20:30.070691 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 19:20:30 crc kubenswrapper[4722]: E0219 19:20:30.070773 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.070537 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp"
Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.070574 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.070863 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.070567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.075878 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.075921 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.075985 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.076432 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.076723 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 19 19:20:32 crc kubenswrapper[4722]: I0219 19:20:32.077186 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.684503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.726888 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.727720 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.728057 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.728390 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.729316 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.729633 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.730751 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-glfz9"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.731222 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.731458 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.731596 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.731617 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.731734 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732015 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732092 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732178 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732279 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732185 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732414 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ndzb8"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732978 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.732993 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.733184 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.735042 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.735297 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.737135 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z8gcw"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.737603 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.742966 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.743804 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.744243 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.744400 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.746124 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bg6mf"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.746762 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.746878 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.746973 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.747061 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.747216 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.747393 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.752361 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.752372 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.753658 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.753893 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.754092 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.754293 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.754686 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.756446 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.756721 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.756923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.757065 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.757254 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.757466 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.757601 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.757834 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.758337 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9"]
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.758820 4722 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-console/downloads-7954f5f757-lg2rd"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759121 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759501 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759572 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759741 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759855 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.760038 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.759858 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.774114 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.774638 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.816187 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.816619 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.817287 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.817392 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.818885 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.819425 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.820534 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.821486 4722 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.821552 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.821746 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.821865 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.823905 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.823934 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.824069 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.824210 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.824362 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.824481 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.824589 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.825351 4722 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.825396 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h4zk8"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.825745 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826046 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826392 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826475 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826620 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826751 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826844 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckm69\" (UniqueName: \"kubernetes.io/projected/bf8b7b84-382a-410f-8dea-c4f485402a77-kube-api-access-ckm69\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826906 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb6fj\" (UniqueName: \"kubernetes.io/projected/a7788f82-4e6b-4d89-b009-0eca5b234009-kube-api-access-rb6fj\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826931 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjd4m\" (UniqueName: \"kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit-dir\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.826980 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-encryption-config\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.828606 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-images\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjcl\" (UniqueName: \"kubernetes.io/projected/47339628-7112-4f7a-b949-fef983428ebe-kube-api-access-9jjcl\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829124 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-node-pullsecrets\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.829711 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbxd7\" (UniqueName: \"kubernetes.io/projected/b7b80c35-8f0b-4f44-af31-0b84ebddd4b8-kube-api-access-nbxd7\") pod \"downloads-7954f5f757-lg2rd\" (UID: \"b7b80c35-8f0b-4f44-af31-0b84ebddd4b8\") " pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.830806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831286 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-tt2bj\" (UniqueName: \"kubernetes.io/projected/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-kube-api-access-tt2bj\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831324 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnn2s\" (UniqueName: \"kubernetes.io/projected/8c255c5e-d6d9-4772-9151-0065df6dc00d-kube-api-access-qnn2s\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831371 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831397 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831460 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-nzgmv"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.827122 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831453 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-serving-cert\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831673 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca\") 
pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-auth-proxy-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831776 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-client\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831829 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-policies\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831904 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831926 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-encryption-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.831960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47339628-7112-4f7a-b949-fef983428ebe-serving-cert\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.828276 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832052 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-client\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832078 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832171 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-config\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832559 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832722 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.832838 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.833170 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.833197 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.833309 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.833543 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-serving-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-dir\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834719 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834743 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-machine-approver-tls\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834766 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54srr\" (UniqueName: \"kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834821 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-image-import-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" 
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8rcx\" (UniqueName: \"kubernetes.io/projected/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-kube-api-access-j8rcx\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxcgd\" (UniqueName: \"kubernetes.io/projected/26779d4b-27e7-4bac-a4d8-5c312a6cec13-kube-api-access-cxcgd\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.834981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a7788f82-4e6b-4d89-b009-0eca5b234009-serving-cert\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-service-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835029 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26779d4b-27e7-4bac-a4d8-5c312a6cec13-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835054 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26779d4b-27e7-4bac-a4d8-5c312a6cec13-config\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: 
\"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-serving-cert\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835173 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835208 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzcqk\" (UniqueName: \"kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835264 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/47339628-7112-4f7a-b949-fef983428ebe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835294 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-config\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835326 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835363 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf8b7b84-382a-410f-8dea-c4f485402a77-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.835467 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config\") pod 
\"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.836422 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.836498 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.837109 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.837263 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.841917 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.841978 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.842131 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.842256 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.842376 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.847676 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.847929 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.852981 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.853170 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p99c4"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.853621 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.870048 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.886455 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.886680 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.886805 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.886952 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.887911 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.888055 4722 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.889204 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.889292 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.889817 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.890189 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.890615 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.891052 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.891300 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.891325 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.891321 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.891523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.892395 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.892722 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.892959 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893132 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893090 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893345 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893462 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893570 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893679 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gtjsk"] Feb 19 19:20:40 
crc kubenswrapper[4722]: I0219 19:20:40.893299 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893885 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893935 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894046 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894283 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893712 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894490 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894604 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894058 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894690 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894739 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893860 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.893755 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.894979 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.897711 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.898740 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.900391 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.900861 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.901030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.901501 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.903342 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-72z7j"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.904038 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.904479 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.904744 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.904797 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.905217 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.908184 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.908690 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.909259 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.909499 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.913276 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.913733 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.914102 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.914350 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.914767 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.915428 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.915635 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.919248 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.919835 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.920039 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.920664 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.924262 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.924293 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z8gcw"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.942138 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h4zk8"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.942273 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/47339628-7112-4f7a-b949-fef983428ebe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-config\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943376 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzcqk\" (UniqueName: 
\"kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943636 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-config\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943682 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf8b7b84-382a-410f-8dea-c4f485402a77-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943776 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px8t8\" (UniqueName: \"kubernetes.io/projected/51679292-9818-418a-98d6-c442dc7d28e2-kube-api-access-px8t8\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943871 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjd4m\" 
(UniqueName: \"kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943892 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit-dir\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943938 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckm69\" (UniqueName: \"kubernetes.io/projected/bf8b7b84-382a-410f-8dea-c4f485402a77-kube-api-access-ckm69\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.943961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb6fj\" (UniqueName: \"kubernetes.io/projected/a7788f82-4e6b-4d89-b009-0eca5b234009-kube-api-access-rb6fj\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944004 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7btmm\" (UniqueName: \"kubernetes.io/projected/2d21a014-83a9-43d9-9cdd-5e0897757c90-kube-api-access-7btmm\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 
19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944033 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-encryption-config\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944057 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjcl\" (UniqueName: \"kubernetes.io/projected/47339628-7112-4f7a-b949-fef983428ebe-kube-api-access-9jjcl\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-images\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944181 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01bb1078-2d76-42f4-919f-3d1b73a61fd4-metrics-tls\") pod \"dns-operator-744455d44c-gtjsk\" 
(UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944206 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wdwp\" (UniqueName: \"kubernetes.io/projected/01bb1078-2d76-42f4-919f-3d1b73a61fd4-kube-api-access-4wdwp\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944261 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47j49\" (UniqueName: \"kubernetes.io/projected/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-kube-api-access-47j49\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944289 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkngc\" (UniqueName: \"kubernetes.io/projected/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-kube-api-access-vkngc\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 
19:20:40.944362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-node-pullsecrets\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944425 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944450 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbxd7\" (UniqueName: \"kubernetes.io/projected/b7b80c35-8f0b-4f44-af31-0b84ebddd4b8-kube-api-access-nbxd7\") pod \"downloads-7954f5f757-lg2rd\" (UID: \"b7b80c35-8f0b-4f44-af31-0b84ebddd4b8\") " pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944512 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmbgg\" (UniqueName: \"kubernetes.io/projected/7c9da917-db10-4eba-bdff-f68354e8d4a6-kube-api-access-lmbgg\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38df6625-e726-49e8-9bff-561442dcea53-metrics-tls\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944633 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt2bj\" (UniqueName: \"kubernetes.io/projected/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-kube-api-access-tt2bj\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944680 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnn2s\" (UniqueName: \"kubernetes.io/projected/8c255c5e-d6d9-4772-9151-0065df6dc00d-kube-api-access-qnn2s\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944803 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944854 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944877 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-serving-cert\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944925 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.944961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945012 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-etcd-client\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945087 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-auth-proxy-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945173 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d21a014-83a9-43d9-9cdd-5e0897757c90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945205 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945252 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945283 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-client\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-policies\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945423 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-encryption-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945445 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38df6625-e726-49e8-9bff-561442dcea53-trusted-ca\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47339628-7112-4f7a-b949-fef983428ebe-serving-cert\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-serving-cert\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc64p\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-kube-api-access-mc64p\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945673 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945706 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-service-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-client\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945783 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-config\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945933 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.945993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946017 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-serving-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946036 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-dir\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946084 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-machine-approver-tls\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946107 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946169 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-image-import-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54srr\" (UniqueName: \"kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 
19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946243 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7c9da917-db10-4eba-bdff-f68354e8d4a6-tmpfs\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946266 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946315 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946379 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946365 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8rcx\" (UniqueName: \"kubernetes.io/projected/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-kube-api-access-j8rcx\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946599 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxcgd\" (UniqueName: \"kubernetes.io/projected/26779d4b-27e7-4bac-a4d8-5c312a6cec13-kube-api-access-cxcgd\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946623 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7788f82-4e6b-4d89-b009-0eca5b234009-serving-cert\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-service-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: 
\"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946669 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946692 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8z64\" (UniqueName: \"kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946741 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26779d4b-27e7-4bac-a4d8-5c312a6cec13-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946765 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26779d4b-27e7-4bac-a4d8-5c312a6cec13-config\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946786 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946809 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-serving-cert\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946829 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.946850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.947710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26779d4b-27e7-4bac-a4d8-5c312a6cec13-config\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.949017 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.949107 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.949189 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.949206 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hxzjr"] Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.949622 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/47339628-7112-4f7a-b949-fef983428ebe-available-featuregates\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.950201 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.950680 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-service-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.979487 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.982351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.983142 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.985493 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-config\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.989657 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.990841 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.991578 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26779d4b-27e7-4bac-a4d8-5c312a6cec13-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.991886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.992190 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" 
Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.993927 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.999521 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:40 crc kubenswrapper[4722]: I0219 19:20:40.995966 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-policies\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:40.950850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-node-pullsecrets\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:40.985790 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-auth-proxy-config\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:40.950946 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/8c255c5e-d6d9-4772-9151-0065df6dc00d-audit-dir\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008016 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-serving-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008321 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008324 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008520 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008599 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf8b7b84-382a-410f-8dea-c4f485402a77-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-serving-cert\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008907 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.008923 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.009454 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.009516 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-image-import-ca\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.009638 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.009811 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.009848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7788f82-4e6b-4d89-b009-0eca5b234009-serving-cert\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:41 crc 
kubenswrapper[4722]: I0219 19:20:41.010411 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.010774 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.011067 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-encryption-config\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.011806 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bf8b7b84-382a-410f-8dea-c4f485402a77-images\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.012238 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-serving-cert\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.000756 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-audit-dir\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.015143 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.015260 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-glfz9"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.015582 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-machine-approver-tls\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.015853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7788f82-4e6b-4d89-b009-0eca5b234009-config\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.016137 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-encryption-config\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.016942 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-etcd-client\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: 
\"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.016995 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.018920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c255c5e-d6d9-4772-9151-0065df6dc00d-etcd-client\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.020489 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c255c5e-d6d9-4772-9151-0065df6dc00d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.020698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47339628-7112-4f7a-b949-fef983428ebe-serving-cert\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.020756 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 
19:20:41.021338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.021660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.022016 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.022054 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8ppnm"]
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.022081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.022794 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm"
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.023738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.023781 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"]
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.024020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.025127 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"]
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.026526 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.026543 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ndzb8"]
Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.027781 4722 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-72z7j"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.029198 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nm78h"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.029777 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.030060 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.030441 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lg2rd"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.031279 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.032464 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.033474 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.034432 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.035591 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.036361 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gtjsk"] Feb 19 
19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.037336 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.038270 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vcmxn"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.039334 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.039427 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.040307 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.041275 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.042230 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.043278 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.044606 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.045773 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 
19:20:41.046888 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bg6mf"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047343 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8z64\" (UniqueName: \"kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047444 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047462 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-config\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047481 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px8t8\" (UniqueName: \"kubernetes.io/projected/51679292-9818-418a-98d6-c442dc7d28e2-kube-api-access-px8t8\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7btmm\" (UniqueName: \"kubernetes.io/projected/2d21a014-83a9-43d9-9cdd-5e0897757c90-kube-api-access-7btmm\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047586 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01bb1078-2d76-42f4-919f-3d1b73a61fd4-metrics-tls\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:41 crc 
kubenswrapper[4722]: I0219 19:20:41.047611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wdwp\" (UniqueName: \"kubernetes.io/projected/01bb1078-2d76-42f4-919f-3d1b73a61fd4-kube-api-access-4wdwp\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047633 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47j49\" (UniqueName: \"kubernetes.io/projected/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-kube-api-access-47j49\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkngc\" (UniqueName: \"kubernetes.io/projected/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-kube-api-access-vkngc\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmbgg\" (UniqueName: \"kubernetes.io/projected/7c9da917-db10-4eba-bdff-f68354e8d4a6-kube-api-access-lmbgg\") 
pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38df6625-e726-49e8-9bff-561442dcea53-metrics-tls\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047757 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-etcd-client\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047845 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2d21a014-83a9-43d9-9cdd-5e0897757c90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047873 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38df6625-e726-49e8-9bff-561442dcea53-trusted-ca\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047895 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-serving-cert\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047951 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc64p\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-kube-api-access-mc64p\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.047976 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: 
\"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.048026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-service-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.048059 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.048102 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.048124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7c9da917-db10-4eba-bdff-f68354e8d4a6-tmpfs\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.048801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/7c9da917-db10-4eba-bdff-f68354e8d4a6-tmpfs\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.049370 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.049539 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.050957 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8ppnm"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.051908 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2d21a014-83a9-43d9-9cdd-5e0897757c90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.051999 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vcmxn"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.053060 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p99c4"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.054500 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.055553 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"] Feb 
19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.056568 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nm78h"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.057547 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.058526 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.059607 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kqs9s"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.060490 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.060570 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kqs9s"] Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.069945 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.090347 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.119162 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.130786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.150051 4722 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.169824 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.190649 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.202345 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/38df6625-e726-49e8-9bff-561442dcea53-metrics-tls\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.210013 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.235842 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.239116 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38df6625-e726-49e8-9bff-561442dcea53-trusted-ca\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.249830 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.270566 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.290250 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.301032 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-serving-cert\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.311908 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.318745 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.330612 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.349980 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.358934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-config\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.369361 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/51679292-9818-418a-98d6-c442dc7d28e2-etcd-client\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.370579 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.389822 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.398759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/51679292-9818-418a-98d6-c442dc7d28e2-etcd-service-ca\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.410224 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.430071 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.450841 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.472113 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.490112 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.509935 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.550601 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.569858 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.590595 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.611385 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.632672 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.650379 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.671065 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.683296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01bb1078-2d76-42f4-919f-3d1b73a61fd4-metrics-tls\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.691008 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.710292 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.731316 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.750636 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.771637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.791872 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.798734 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.798805 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 
19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.811177 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.830688 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.850039 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.871520 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.890265 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.908899 4722 request.go:700] Waited for 1.003806087s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-ac-dockercfg-9lkdf&limit=500&resourceVersion=0 Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.911113 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.930888 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.951022 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.970494 4722 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 19:20:41 crc kubenswrapper[4722]: I0219 19:20:41.990368 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.001341 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.010189 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.030049 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047610 4722 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047675 4722 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047698 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca podName:cb6886b7-9193-4c89-96c8-64b61c3251a4 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.547672135 +0000 UTC m=+142.160022499 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca") pod "marketplace-operator-79b997595-4gbkr" (UID: "cb6886b7-9193-4c89-96c8-64b61c3251a4") : failed to sync configmap cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047799 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert podName:c02c0f7a-9c0e-4d91-aca7-9648bace7d2f nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.547771598 +0000 UTC m=+142.160121962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-5xkn2" (UID: "c02c0f7a-9c0e-4d91-aca7-9648bace7d2f") : failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047837 4722 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047882 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert podName:e8ae2d71-7578-4343-a1ba-5d414cd1cc4b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.547865951 +0000 UTC m=+142.160216315 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert") pod "olm-operator-6b444d44fb-cjtjp" (UID: "e8ae2d71-7578-4343-a1ba-5d414cd1cc4b") : failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047910 4722 secret.go:188] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047952 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics podName:cb6886b7-9193-4c89-96c8-64b61c3251a4 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.547940273 +0000 UTC m=+142.160290627 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics") pod "marketplace-operator-79b997595-4gbkr" (UID: "cb6886b7-9193-4c89-96c8-64b61c3251a4") : failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.047970 4722 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.048033 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert podName:7c9da917-db10-4eba-bdff-f68354e8d4a6 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.548016636 +0000 UTC m=+142.160366990 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert") pod "packageserver-d55dfcdfc-klvwp" (UID: "7c9da917-db10-4eba-bdff-f68354e8d4a6") : failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.048879 4722 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.048910 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config podName:c02c0f7a-9c0e-4d91-aca7-9648bace7d2f nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.548899954 +0000 UTC m=+142.161250278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config") pod "kube-storage-version-migrator-operator-b67b599dd-5xkn2" (UID: "c02c0f7a-9c0e-4d91-aca7-9648bace7d2f") : failed to sync configmap cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.048923 4722 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: E0219 19:20:42.048991 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert podName:7c9da917-db10-4eba-bdff-f68354e8d4a6 nodeName:}" failed. No retries permitted until 2026-02-19 19:20:42.548975577 +0000 UTC m=+142.161325941 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert") pod "packageserver-d55dfcdfc-klvwp" (UID: "7c9da917-db10-4eba-bdff-f68354e8d4a6") : failed to sync secret cache: timed out waiting for the condition Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.050957 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.069848 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.090637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.111328 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.130699 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.160483 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.170550 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.191132 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.211583 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.230846 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.250710 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.270854 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.290857 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.311648 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.330759 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.350359 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.370439 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.390339 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.409922 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.489443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8rcx\" (UniqueName: \"kubernetes.io/projected/d33763f9-ec4f-4337-b9d4-f5c25ec6eabc-kube-api-access-j8rcx\") pod \"machine-approver-56656f9798-pxpb9\" (UID: \"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.506103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxcgd\" (UniqueName: \"kubernetes.io/projected/26779d4b-27e7-4bac-a4d8-5c312a6cec13-kube-api-access-cxcgd\") pod \"openshift-apiserver-operator-796bbdcf4f-576vp\" (UID: \"26779d4b-27e7-4bac-a4d8-5c312a6cec13\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.511064 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzcqk\" (UniqueName: \"kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk\") pod \"controller-manager-879f6c89f-xn22j\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.511467 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.528840 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.530948 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.551895 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 19:20:42 crc kubenswrapper[4722]: W0219 19:20:42.554402 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd33763f9_ec4f_4337_b9d4_f5c25ec6eabc.slice/crio-8e4d2b050834c28fb8daa9df797f0568d0e53de94f4de3091921bec58b715e43 WatchSource:0}: Error finding container 8e4d2b050834c28fb8daa9df797f0568d0e53de94f4de3091921bec58b715e43: Status 404 returned error can't find the container with id 8e4d2b050834c28fb8daa9df797f0568d0e53de94f4de3091921bec58b715e43 Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566037 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566126 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566202 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566309 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.566334 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.568530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.568606 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.569998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.570966 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-srv-cert\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.571326 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.572573 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-webhook-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.573279 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7c9da917-db10-4eba-bdff-f68354e8d4a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.589065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt2bj\" (UniqueName: \"kubernetes.io/projected/5693eddb-45a4-4cee-acb8-d3c0f23d16b8-kube-api-access-tt2bj\") pod \"apiserver-7bbb656c7d-6mfpq\" (UID: \"5693eddb-45a4-4cee-acb8-d3c0f23d16b8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.615060 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckm69\" (UniqueName: \"kubernetes.io/projected/bf8b7b84-382a-410f-8dea-c4f485402a77-kube-api-access-ckm69\") pod \"machine-api-operator-5694c8668f-glfz9\" (UID: \"bf8b7b84-382a-410f-8dea-c4f485402a77\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 
19:20:42.625034 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.625555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb6fj\" (UniqueName: \"kubernetes.io/projected/a7788f82-4e6b-4d89-b009-0eca5b234009-kube-api-access-rb6fj\") pod \"authentication-operator-69f744f599-z8gcw\" (UID: \"a7788f82-4e6b-4d89-b009-0eca5b234009\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.641966 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.647370 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbxd7\" (UniqueName: \"kubernetes.io/projected/b7b80c35-8f0b-4f44-af31-0b84ebddd4b8-kube-api-access-nbxd7\") pod \"downloads-7954f5f757-lg2rd\" (UID: \"b7b80c35-8f0b-4f44-af31-0b84ebddd4b8\") " pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.674266 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.679050 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjcl\" (UniqueName: \"kubernetes.io/projected/47339628-7112-4f7a-b949-fef983428ebe-kube-api-access-9jjcl\") pod \"openshift-config-operator-7777fb866f-vrqgd\" (UID: \"47339628-7112-4f7a-b949-fef983428ebe\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.693105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjd4m\" (UniqueName: \"kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m\") pod \"oauth-openshift-558db77b4-ndzb8\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") " pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.708884 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnn2s\" (UniqueName: \"kubernetes.io/projected/8c255c5e-d6d9-4772-9151-0065df6dc00d-kube-api-access-qnn2s\") pod \"apiserver-76f77b778f-bg6mf\" (UID: \"8c255c5e-d6d9-4772-9151-0065df6dc00d\") " pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.719441 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.727543 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54srr\" (UniqueName: \"kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr\") pod \"route-controller-manager-6576b87f9c-hj8tk\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.731586 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.749136 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.750953 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.772625 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.791110 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.810390 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.821361 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.830338 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.835277 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.838363 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" event={"ID":"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc","Type":"ContainerStarted","Data":"8e4d2b050834c28fb8daa9df797f0568d0e53de94f4de3091921bec58b715e43"} Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.842672 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.844517 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"] Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.850895 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.857062 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:42 crc kubenswrapper[4722]: W0219 19:20:42.858313 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5693eddb_45a4_4cee_acb8_d3c0f23d16b8.slice/crio-f79630f1115840f2d9470a908c6aa100f0c2f7df67fa6c49602d64b93261c839 WatchSource:0}: Error finding container f79630f1115840f2d9470a908c6aa100f0c2f7df67fa6c49602d64b93261c839: Status 404 returned error can't find the container with id f79630f1115840f2d9470a908c6aa100f0c2f7df67fa6c49602d64b93261c839 Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.871623 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.873348 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp"] Feb 19 19:20:42 crc kubenswrapper[4722]: W0219 19:20:42.892762 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26779d4b_27e7_4bac_a4d8_5c312a6cec13.slice/crio-bacbd1ad79cad6f294317fd43a808d54eeaabb55461701a8a110ac85e14420bb WatchSource:0}: Error finding container bacbd1ad79cad6f294317fd43a808d54eeaabb55461701a8a110ac85e14420bb: Status 404 returned error can't find the container with id bacbd1ad79cad6f294317fd43a808d54eeaabb55461701a8a110ac85e14420bb Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.892963 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.910355 4722 request.go:700] Waited for 1.870719484s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.913116 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.932580 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.945261 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.952143 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.990564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8z64\" (UniqueName: \"kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64\") pod \"marketplace-operator-79b997595-4gbkr\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.991426 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:42 crc kubenswrapper[4722]: I0219 19:20:42.992887 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.002364 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-glfz9"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.007682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.030085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7btmm\" (UniqueName: \"kubernetes.io/projected/2d21a014-83a9-43d9-9cdd-5e0897757c90-kube-api-access-7btmm\") pod \"cluster-samples-operator-665b6dd947-pwpjg\" (UID: \"2d21a014-83a9-43d9-9cdd-5e0897757c90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.047956 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47j49\" (UniqueName: \"kubernetes.io/projected/e8ae2d71-7578-4343-a1ba-5d414cd1cc4b-kube-api-access-47j49\") pod \"olm-operator-6b444d44fb-cjtjp\" (UID: \"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.065249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wdwp\" (UniqueName: \"kubernetes.io/projected/01bb1078-2d76-42f4-919f-3d1b73a61fd4-kube-api-access-4wdwp\") pod \"dns-operator-744455d44c-gtjsk\" (UID: \"01bb1078-2d76-42f4-919f-3d1b73a61fd4\") " pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:43 crc 
kubenswrapper[4722]: I0219 19:20:43.087286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px8t8\" (UniqueName: \"kubernetes.io/projected/51679292-9818-418a-98d6-c442dc7d28e2-kube-api-access-px8t8\") pod \"etcd-operator-b45778765-p99c4\" (UID: \"51679292-9818-418a-98d6-c442dc7d28e2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.118083 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkngc\" (UniqueName: \"kubernetes.io/projected/c02c0f7a-9c0e-4d91-aca7-9648bace7d2f-kube-api-access-vkngc\") pod \"kube-storage-version-migrator-operator-b67b599dd-5xkn2\" (UID: \"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.135655 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmbgg\" (UniqueName: \"kubernetes.io/projected/7c9da917-db10-4eba-bdff-f68354e8d4a6-kube-api-access-lmbgg\") pod \"packageserver-d55dfcdfc-klvwp\" (UID: \"7c9da917-db10-4eba-bdff-f68354e8d4a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.149698 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.150872 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.152322 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bg6mf"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.153623 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc64p\" (UniqueName: \"kubernetes.io/projected/38df6625-e726-49e8-9bff-561442dcea53-kube-api-access-mc64p\") pod \"ingress-operator-5b745b69d9-kvv76\" (UID: \"38df6625-e726-49e8-9bff-561442dcea53\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:43 crc kubenswrapper[4722]: W0219 19:20:43.165652 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c255c5e_d6d9_4772_9151_0065df6dc00d.slice/crio-accff4e6dd3fb27abdc912d4e1d0252a56be908fc7ae0600a9f72f81ca272868 WatchSource:0}: Error finding container accff4e6dd3fb27abdc912d4e1d0252a56be908fc7ae0600a9f72f81ca272868: Status 404 returned error can't find the container with id accff4e6dd3fb27abdc912d4e1d0252a56be908fc7ae0600a9f72f81ca272868 Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.174643 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.196329 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.206543 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.233721 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.284298 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.289567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.289852 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.289870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ea57e2-def2-4a73-a86b-75be99e36e46-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.289896 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f00e2406-a55b-4e28-bed9-a060b0780301-serving-cert\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.289914 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a159fd13-de0a-46d3-971f-fb7c2fc652bd-config\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290051 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290166 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290208 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290226 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdj9\" (UniqueName: \"kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290244 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290264 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/669833fe-83b5-4d4a-a78c-c360789f754b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc 
kubenswrapper[4722]: I0219 19:20:43.290285 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290303 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbbxv\" (UniqueName: \"kubernetes.io/projected/2d33e000-0a81-4601-8120-52dacf0b5d6b-kube-api-access-zbbxv\") pod \"migrator-59844c95c7-9sqrc\" (UID: \"2d33e000-0a81-4601-8120-52dacf0b5d6b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290325 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e040b47b-3688-40e2-a410-0dfa43ad8ef3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290360 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290378 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2csz\" (UniqueName: 
\"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgsrz\" (UniqueName: \"kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-metrics-certs\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-proxy-tls\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" 
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290509 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/809781cd-b87f-423a-957c-0d20e074306e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwxck\" (UniqueName: \"kubernetes.io/projected/e040b47b-3688-40e2-a410-0dfa43ad8ef3-kube-api-access-mwxck\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290551 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zg7\" (UniqueName: \"kubernetes.io/projected/f00e2406-a55b-4e28-bed9-a060b0780301-kube-api-access-h7zg7\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290568 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290600 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290615 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290635 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzk45\" (UniqueName: \"kubernetes.io/projected/a159fd13-de0a-46d3-971f-fb7c2fc652bd-kube-api-access-zzk45\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt4vb\" (UniqueName: \"kubernetes.io/projected/41fade82-0d8d-41b2-805e-8a92ffa97cf3-kube-api-access-rt4vb\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a159fd13-de0a-46d3-971f-fb7c2fc652bd-serving-cert\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-default-certificate\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290746 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669833fe-83b5-4d4a-a78c-c360789f754b-config\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290773 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9xdq\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-kube-api-access-r9xdq\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290789 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-images\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv46w\" (UniqueName: \"kubernetes.io/projected/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-kube-api-access-vv46w\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290841 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290857 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290873 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290887 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88gjx\" (UniqueName: \"kubernetes.io/projected/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-kube-api-access-88gjx\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290902 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3ea57e2-def2-4a73-a86b-75be99e36e46-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/41fade82-0d8d-41b2-805e-8a92ffa97cf3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.290994 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-srv-cert\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291009 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-stats-auth\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291022 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-trusted-ca\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291042 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3071e162-d262-4732-81ca-10bb9b507321-service-ca-bundle\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291056 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809781cd-b87f-423a-957c-0d20e074306e-config\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291072 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291087 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/669833fe-83b5-4d4a-a78c-c360789f754b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-profile-collector-cert\") 
pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291117 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8l4m\" (UniqueName: \"kubernetes.io/projected/b3ea57e2-def2-4a73-a86b-75be99e36e46-kube-api-access-j8l4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291135 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291164 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-config\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291210 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vwb2\" (UniqueName: \"kubernetes.io/projected/3071e162-d262-4732-81ca-10bb9b507321-kube-api-access-9vwb2\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc 
kubenswrapper[4722]: I0219 19:20:43.291227 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291254 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/809781cd-b87f-423a-957c-0d20e074306e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-proxy-tls\") pod 
\"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsgwl\" (UniqueName: \"kubernetes.io/projected/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-kube-api-access-zsgwl\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.291354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9mt8\" (UniqueName: \"kubernetes.io/projected/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-kube-api-access-t9mt8\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.297519 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:43.797502366 +0000 UTC m=+143.409852760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.299637 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.301030 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z8gcw"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.307514 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.316821 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.392823 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.392996 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:20:43.892971166 +0000 UTC m=+143.505321490 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393084 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2csz\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393119 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgsrz\" (UniqueName: \"kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-mountpoint-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-socket-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-metrics-certs\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393345 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393372 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-proxy-tls\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393422 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/809781cd-b87f-423a-957c-0d20e074306e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393444 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwxck\" (UniqueName: \"kubernetes.io/projected/e040b47b-3688-40e2-a410-0dfa43ad8ef3-kube-api-access-mwxck\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393466 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zg7\" (UniqueName: \"kubernetes.io/projected/f00e2406-a55b-4e28-bed9-a060b0780301-kube-api-access-h7zg7\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393487 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-key\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393509 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-node-bootstrap-token\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393592 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393664 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzk45\" (UniqueName: \"kubernetes.io/projected/a159fd13-de0a-46d3-971f-fb7c2fc652bd-kube-api-access-zzk45\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt4vb\" (UniqueName: 
\"kubernetes.io/projected/41fade82-0d8d-41b2-805e-8a92ffa97cf3-kube-api-access-rt4vb\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a159fd13-de0a-46d3-971f-fb7c2fc652bd-serving-cert\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393741 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-default-certificate\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393788 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/669833fe-83b5-4d4a-a78c-c360789f754b-config\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393818 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9xdq\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-kube-api-access-r9xdq\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393841 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-images\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393875 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.393899 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv46w\" (UniqueName: \"kubernetes.io/projected/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-kube-api-access-vv46w\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394311 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3ea57e2-def2-4a73-a86b-75be99e36e46-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394362 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-cabundle\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394417 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88gjx\" (UniqueName: \"kubernetes.io/projected/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-kube-api-access-88gjx\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394444 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/41fade82-0d8d-41b2-805e-8a92ffa97cf3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394463 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l86s9\" (UniqueName: \"kubernetes.io/projected/71548ff6-f831-48ba-af51-99fe431c447a-kube-api-access-l86s9\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394477 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjsx\" (UniqueName: \"kubernetes.io/projected/814d776b-73c6-4354-8195-da5d3ea2d5cb-kube-api-access-xvjsx\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394537 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-vtkgt\" (UniqueName: \"kubernetes.io/projected/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-kube-api-access-vtkgt\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-srv-cert\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394574 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-stats-auth\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394592 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-trusted-ca\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3071e162-d262-4732-81ca-10bb9b507321-service-ca-bundle\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394629 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809781cd-b87f-423a-957c-0d20e074306e-config\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-profile-collector-cert\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8l4m\" (UniqueName: \"kubernetes.io/projected/b3ea57e2-def2-4a73-a86b-75be99e36e46-kube-api-access-j8l4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394682 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/669833fe-83b5-4d4a-a78c-c360789f754b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" 
(UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394713 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-config\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394730 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394779 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vwb2\" (UniqueName: \"kubernetes.io/projected/3071e162-d262-4732-81ca-10bb9b507321-kube-api-access-9vwb2\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394795 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394824 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/809781cd-b87f-423a-957c-0d20e074306e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394843 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2pqg\" (UniqueName: \"kubernetes.io/projected/bb502645-30c6-437d-abc3-28de80105939-kube-api-access-m2pqg\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-registration-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-proxy-tls\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 
19:20:43.394915 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394935 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsgwl\" (UniqueName: \"kubernetes.io/projected/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-kube-api-access-zsgwl\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9mt8\" (UniqueName: \"kubernetes.io/projected/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-kube-api-access-t9mt8\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.394998 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: 
\"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f00e2406-a55b-4e28-bed9-a060b0780301-serving-cert\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395032 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ea57e2-def2-4a73-a86b-75be99e36e46-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a159fd13-de0a-46d3-971f-fb7c2fc652bd-config\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395084 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-certs\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395104 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395119 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-csi-data-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395146 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71548ff6-f831-48ba-af51-99fe431c447a-config-volume\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395198 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395235 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82lfl\" (UniqueName: \"kubernetes.io/projected/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-kube-api-access-82lfl\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395255 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdj9\" (UniqueName: \"kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395310 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71548ff6-f831-48ba-af51-99fe431c447a-metrics-tls\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395328 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395365 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/669833fe-83b5-4d4a-a78c-c360789f754b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395379 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-cert\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395405 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbbxv\" (UniqueName: \"kubernetes.io/projected/2d33e000-0a81-4601-8120-52dacf0b5d6b-kube-api-access-zbbxv\") pod \"migrator-59844c95c7-9sqrc\" (UID: \"2d33e000-0a81-4601-8120-52dacf0b5d6b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395458 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e040b47b-3688-40e2-a410-0dfa43ad8ef3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.395504 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-plugins-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.396694 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-images\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.396878 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:43.89686789 +0000 UTC m=+143.509218214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.398224 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.398434 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.399738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.401475 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ea57e2-def2-4a73-a86b-75be99e36e46-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: 
\"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.402304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.402417 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a159fd13-de0a-46d3-971f-fb7c2fc652bd-config\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.403867 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.405300 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.405928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.411377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.411397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-config\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.411787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f00e2406-a55b-4e28-bed9-a060b0780301-serving-cert\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.412389 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.413132 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/669833fe-83b5-4d4a-a78c-c360789f754b-config\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.413204 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.413636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.416374 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.419249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.419819 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-metrics-certs\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.421338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.421742 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a159fd13-de0a-46d3-971f-fb7c2fc652bd-serving-cert\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.421997 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-proxy-tls\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.422298 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/669833fe-83b5-4d4a-a78c-c360789f754b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" Feb 19 19:20:43 
crc kubenswrapper[4722]: I0219 19:20:43.425558 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.425836 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.425995 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-srv-cert\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.426111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e040b47b-3688-40e2-a410-0dfa43ad8ef3-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.426173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config\") pod \"console-f9d7485db-txlzt\" (UID: 
\"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.426388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/809781cd-b87f-423a-957c-0d20e074306e-config\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.428010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f00e2406-a55b-4e28-bed9-a060b0780301-trusted-ca\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.429672 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-stats-auth\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.429863 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-profile-collector-cert\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.432646 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3ea57e2-def2-4a73-a86b-75be99e36e46-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.433861 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-proxy-tls\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.434772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3071e162-d262-4732-81ca-10bb9b507321-service-ca-bundle\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.434816 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3071e162-d262-4732-81ca-10bb9b507321-default-certificate\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.435843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zg7\" (UniqueName: \"kubernetes.io/projected/f00e2406-a55b-4e28-bed9-a060b0780301-kube-api-access-h7zg7\") pod \"console-operator-58897d9998-h4zk8\" (UID: \"f00e2406-a55b-4e28-bed9-a060b0780301\") " pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.445300 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.448227 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/809781cd-b87f-423a-957c-0d20e074306e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.448518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.448551 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/41fade82-0d8d-41b2-805e-8a92ffa97cf3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.448568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.454445 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.456869 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.459496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt4vb\" (UniqueName: \"kubernetes.io/projected/41fade82-0d8d-41b2-805e-8a92ffa97cf3-kube-api-access-rt4vb\") pod \"control-plane-machine-set-operator-78cbb6b69f-r4jmd\" (UID: \"41fade82-0d8d-41b2-805e-8a92ffa97cf3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.460341 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lg2rd"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.467007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vwb2\" (UniqueName: \"kubernetes.io/projected/3071e162-d262-4732-81ca-10bb9b507321-kube-api-access-9vwb2\") pod \"router-default-5444994796-nzgmv\" (UID: \"3071e162-d262-4732-81ca-10bb9b507321\") " pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.482251 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.493668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496413 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496688 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-registration-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496945 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-certs\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-csi-data-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " 
pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.496985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71548ff6-f831-48ba-af51-99fe431c447a-config-volume\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82lfl\" (UniqueName: \"kubernetes.io/projected/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-kube-api-access-82lfl\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71548ff6-f831-48ba-af51-99fe431c447a-metrics-tls\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-cert\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-plugins-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497105 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-mountpoint-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497119 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-socket-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497195 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-key\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-node-bootstrap-token\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497266 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-cabundle\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497282 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l86s9\" (UniqueName: \"kubernetes.io/projected/71548ff6-f831-48ba-af51-99fe431c447a-kube-api-access-l86s9\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjsx\" (UniqueName: \"kubernetes.io/projected/814d776b-73c6-4354-8195-da5d3ea2d5cb-kube-api-access-xvjsx\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtkgt\" (UniqueName: \"kubernetes.io/projected/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-kube-api-access-vtkgt\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497371 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2pqg\" (UniqueName: \"kubernetes.io/projected/bb502645-30c6-437d-abc3-28de80105939-kube-api-access-m2pqg\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.497395 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:43.997379481 +0000 UTC m=+143.609729795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.497460 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-csi-data-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.498026 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-mountpoint-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.498384 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-plugins-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.498807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9xdq\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-kube-api-access-r9xdq\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.499015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-registration-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.499059 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bb502645-30c6-437d-abc3-28de80105939-socket-dir\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.499405 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.503304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/71548ff6-f831-48ba-af51-99fe431c447a-metrics-tls\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.504738 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.504954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-key\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc 
kubenswrapper[4722]: I0219 19:20:43.507016 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-certs\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.507240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/814d776b-73c6-4354-8195-da5d3ea2d5cb-node-bootstrap-token\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.508703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgsrz\" (UniqueName: \"kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz\") pod \"collect-profiles-29525475-wskf7\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.509393 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-cert\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.526004 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/809781cd-b87f-423a-957c-0d20e074306e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-t6ljp\" (UID: \"809781cd-b87f-423a-957c-0d20e074306e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" Feb 19 
19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.533729 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-gtjsk"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.542273 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71548ff6-f831-48ba-af51-99fe431c447a-config-volume\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.543659 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzk45\" (UniqueName: \"kubernetes.io/projected/a159fd13-de0a-46d3-971f-fb7c2fc652bd-kube-api-access-zzk45\") pod \"service-ca-operator-777779d784-4jhs8\" (UID: \"a159fd13-de0a-46d3-971f-fb7c2fc652bd\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.555436 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.556645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-signing-cabundle\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.572450 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ndzb8"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.572873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsgwl\" (UniqueName: \"kubernetes.io/projected/b1bae0d8-92c9-40e9-ad8d-cc01467c8d93-kube-api-access-zsgwl\") pod \"multus-admission-controller-857f4d67dd-72z7j\" (UID: \"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" Feb 19 19:20:43 crc kubenswrapper[4722]: W0219 19:20:43.592945 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecc880c8_beb9_4081_8af6_64d2fa857901.slice/crio-e844f80f4659e52890f34ecd1020791a32cbf271dac55e2d79171097c0004545 WatchSource:0}: Error finding container e844f80f4659e52890f34ecd1020791a32cbf271dac55e2d79171097c0004545: Status 404 returned error can't find the container with id e844f80f4659e52890f34ecd1020791a32cbf271dac55e2d79171097c0004545 Feb 19 19:20:43 crc kubenswrapper[4722]: W0219 19:20:43.595144 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3071e162_d262_4732_81ca_10bb9b507321.slice/crio-4e84b454d40153b5e9bc92e4d009a60cc8856058e2ebe2b49ffcfce633588b37 WatchSource:0}: Error finding 
container 4e84b454d40153b5e9bc92e4d009a60cc8856058e2ebe2b49ffcfce633588b37: Status 404 returned error can't find the container with id 4e84b454d40153b5e9bc92e4d009a60cc8856058e2ebe2b49ffcfce633588b37 Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.597882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.601439 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.101406394 +0000 UTC m=+143.713756718 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.612491 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv46w\" (UniqueName: \"kubernetes.io/projected/f5ac7e96-c772-449a-9e9d-d7dabfc6974e-kube-api-access-vv46w\") pod \"machine-config-operator-74547568cd-c5rfs\" (UID: \"f5ac7e96-c772-449a-9e9d-d7dabfc6974e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.617086 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.621025 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.628271 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.646943 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9mt8\" (UniqueName: \"kubernetes.io/projected/9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7-kube-api-access-t9mt8\") pod \"machine-config-controller-84d6567774-w8vrs\" (UID: \"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.669462 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2csz\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.685900 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-p99c4"] Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.686421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdj9\" (UniqueName: \"kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9\") pod \"console-f9d7485db-txlzt\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.698826 
4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.699016 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.198989271 +0000 UTC m=+143.811339595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.699049 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.699440 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.199432075 +0000 UTC m=+143.811782399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.708873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f25b1c29-b400-4bd5-8e63-ac31629a0aa2-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvksz\" (UID: \"f25b1c29-b400-4bd5-8e63-ac31629a0aa2\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz"
Feb 19 19:20:43 crc kubenswrapper[4722]: W0219 19:20:43.721719 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51679292_9818_418a_98d6_c442dc7d28e2.slice/crio-ee3f3b04f30841b8874fa8bdc4b3725d0b4c9e7779d65da0483477ea796c9fa9 WatchSource:0}: Error finding container ee3f3b04f30841b8874fa8bdc4b3725d0b4c9e7779d65da0483477ea796c9fa9: Status 404 returned error can't find the container with id ee3f3b04f30841b8874fa8bdc4b3725d0b4c9e7779d65da0483477ea796c9fa9
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.730761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2ce4ba5c-9c53-4a07-a57d-3c3532449ae8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-759tp\" (UID: \"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.747427 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/669833fe-83b5-4d4a-a78c-c360789f754b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z9r7q\" (UID: \"669833fe-83b5-4d4a-a78c-c360789f754b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.761799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-txlzt"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.768054 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.769466 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd"]
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.776855 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbbxv\" (UniqueName: \"kubernetes.io/projected/2d33e000-0a81-4601-8120-52dacf0b5d6b-kube-api-access-zbbxv\") pod \"migrator-59844c95c7-9sqrc\" (UID: \"2d33e000-0a81-4601-8120-52dacf0b5d6b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.785792 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwxck\" (UniqueName: \"kubernetes.io/projected/e040b47b-3688-40e2-a410-0dfa43ad8ef3-kube-api-access-mwxck\") pod \"package-server-manager-789f6589d5-vh5vl\" (UID: \"e040b47b-3688-40e2-a410-0dfa43ad8ef3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.796467 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.797653 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-h4zk8"]
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.799672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.799857 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.299836522 +0000 UTC m=+143.912186846 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.799968 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.800290 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.300282577 +0000 UTC m=+143.912632901 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.805488 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8l4m\" (UniqueName: \"kubernetes.io/projected/b3ea57e2-def2-4a73-a86b-75be99e36e46-kube-api-access-j8l4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-h4kvb\" (UID: \"b3ea57e2-def2-4a73-a86b-75be99e36e46\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.807960 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp"]
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.812583 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.822108 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2"]
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.825958 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.826478 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.828772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88gjx\" (UniqueName: \"kubernetes.io/projected/6a5359b9-b29a-4c86-8dc8-f00b659cecb0-kube-api-access-88gjx\") pod \"catalog-operator-68c6474976-hwl66\" (UID: \"6a5359b9-b29a-4c86-8dc8-f00b659cecb0\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.843789 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.845815 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82lfl\" (UniqueName: \"kubernetes.io/projected/2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80-kube-api-access-82lfl\") pod \"ingress-canary-vcmxn\" (UID: \"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80\") " pod="openshift-ingress-canary/ingress-canary-vcmxn"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.849861 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" event={"ID":"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b","Type":"ContainerStarted","Data":"937a76b33078699fdd749624ad5fbc055d6636f974ccd1fc5e83353583659e23"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.853644 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" event={"ID":"38df6625-e726-49e8-9bff-561442dcea53","Type":"ContainerStarted","Data":"980a7eef68679d7d7667805ee210f572632e488a6a12ac11df8fce3e620735b4"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.853687 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" event={"ID":"38df6625-e726-49e8-9bff-561442dcea53","Type":"ContainerStarted","Data":"9d1191245297017781f6f0d59c9e68f7b88c6be5d638855d55459cf690589f08"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.855877 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.856294 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" event={"ID":"a7788f82-4e6b-4d89-b009-0eca5b234009","Type":"ContainerStarted","Data":"a3c583618743072d3a8c63892ba8aed32eb325aba749dae0a0acd16ff2007d50"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.856322 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" event={"ID":"a7788f82-4e6b-4d89-b009-0eca5b234009","Type":"ContainerStarted","Data":"a026f18c23225d1771444fa7c9ac77f8b95183a72ab6190d1f7080aa1962c8b8"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.857605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nzgmv" event={"ID":"3071e162-d262-4732-81ca-10bb9b507321","Type":"ContainerStarted","Data":"4e84b454d40153b5e9bc92e4d009a60cc8856058e2ebe2b49ffcfce633588b37"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.861316 4722 generic.go:334] "Generic (PLEG): container finished" podID="5693eddb-45a4-4cee-acb8-d3c0f23d16b8" containerID="1eaef094f51d48c22c87113b7f264cf4f078eb98ddaab98503078950021a15ac" exitCode=0
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.861394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" event={"ID":"5693eddb-45a4-4cee-acb8-d3c0f23d16b8","Type":"ContainerDied","Data":"1eaef094f51d48c22c87113b7f264cf4f078eb98ddaab98503078950021a15ac"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.861423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" event={"ID":"5693eddb-45a4-4cee-acb8-d3c0f23d16b8","Type":"ContainerStarted","Data":"f79630f1115840f2d9470a908c6aa100f0c2f7df67fa6c49602d64b93261c839"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.865040 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.866354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" event={"ID":"bf8b7b84-382a-410f-8dea-c4f485402a77","Type":"ContainerStarted","Data":"4524fe7e56fb120ed1f10ce6083b180ce28a4063125657070a3dd348cfebd5dd"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.866392 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" event={"ID":"bf8b7b84-382a-410f-8dea-c4f485402a77","Type":"ContainerStarted","Data":"b449e993c7b506aa07679f65f11ca216831fe291a98147e01c75d4ae55f5d767"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.866404 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" event={"ID":"bf8b7b84-382a-410f-8dea-c4f485402a77","Type":"ContainerStarted","Data":"db7a9df904e922606f774e25d285494e99875d1b8fcc171af6ed4dca16c8ade1"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.868510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2pqg\" (UniqueName: \"kubernetes.io/projected/bb502645-30c6-437d-abc3-28de80105939-kube-api-access-m2pqg\") pod \"csi-hostpathplugin-kqs9s\" (UID: \"bb502645-30c6-437d-abc3-28de80105939\") " pod="hostpath-provisioner/csi-hostpathplugin-kqs9s"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.871640 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.872287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" event={"ID":"f00e2406-a55b-4e28-bed9-a060b0780301","Type":"ContainerStarted","Data":"82508cfbd650f98148fb226498a6f0165ea19ee56f6b84eb2bb880b85c8acc4e"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.874407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" event={"ID":"26779d4b-27e7-4bac-a4d8-5c312a6cec13","Type":"ContainerStarted","Data":"e9a59274a7088a93a2ec77294d99e283bb869e2553d73855111baa2c25d9f8dc"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.874464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" event={"ID":"26779d4b-27e7-4bac-a4d8-5c312a6cec13","Type":"ContainerStarted","Data":"bacbd1ad79cad6f294317fd43a808d54eeaabb55461701a8a110ac85e14420bb"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.878835 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.887623 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l86s9\" (UniqueName: \"kubernetes.io/projected/71548ff6-f831-48ba-af51-99fe431c447a-kube-api-access-l86s9\") pod \"dns-default-nm78h\" (UID: \"71548ff6-f831-48ba-af51-99fe431c447a\") " pod="openshift-dns/dns-default-nm78h"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.888835 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp"]
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.901103 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:43 crc kubenswrapper[4722]: E0219 19:20:43.901447 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.401427898 +0000 UTC m=+144.013778222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.910443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjsx\" (UniqueName: \"kubernetes.io/projected/814d776b-73c6-4354-8195-da5d3ea2d5cb-kube-api-access-xvjsx\") pod \"machine-config-server-hxzjr\" (UID: \"814d776b-73c6-4354-8195-da5d3ea2d5cb\") " pod="openshift-machine-config-operator/machine-config-server-hxzjr"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.912207 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" event={"ID":"ecc880c8-beb9-4081-8af6-64d2fa857901","Type":"ContainerStarted","Data":"e844f80f4659e52890f34ecd1020791a32cbf271dac55e2d79171097c0004545"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.919248 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" event={"ID":"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f","Type":"ContainerStarted","Data":"f0e89d11d34992a9d3e2fd6ed43adbfd1bd19f8094145800e8d76f0e8ae93eaf"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.925928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lg2rd" event={"ID":"b7b80c35-8f0b-4f44-af31-0b84ebddd4b8","Type":"ContainerStarted","Data":"04081e8ba931886e1736ede667ab583c82c3730fd987315d4b52e12ed7c811d5"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.926541 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hxzjr"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.932662 4722 generic.go:334] "Generic (PLEG): container finished" podID="8c255c5e-d6d9-4772-9151-0065df6dc00d" containerID="5c44c055c09feba5ea63deb960006d6c67d2cfab710a3a6c6d5997ee7bb87a61" exitCode=0
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.932939 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" event={"ID":"8c255c5e-d6d9-4772-9151-0065df6dc00d","Type":"ContainerDied","Data":"5c44c055c09feba5ea63deb960006d6c67d2cfab710a3a6c6d5997ee7bb87a61"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.932980 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" event={"ID":"8c255c5e-d6d9-4772-9151-0065df6dc00d","Type":"ContainerStarted","Data":"accff4e6dd3fb27abdc912d4e1d0252a56be908fc7ae0600a9f72f81ca272868"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.942370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" event={"ID":"2d21a014-83a9-43d9-9cdd-5e0897757c90","Type":"ContainerStarted","Data":"0930f6cbddbd346945e235439f5391cf2978ec9a51ea4717faa4f07b16a397fd"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.942636 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nm78h"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.946771 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vcmxn"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.948078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtkgt\" (UniqueName: \"kubernetes.io/projected/03c35cd8-4a1b-4847-a5f2-0fe0e884d191-kube-api-access-vtkgt\") pod \"service-ca-9c57cc56f-8ppnm\" (UID: \"03c35cd8-4a1b-4847-a5f2-0fe0e884d191\") " pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.948464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" event={"ID":"c1782da0-924a-481b-b0fc-20050e168591","Type":"ContainerStarted","Data":"91e29c4c51cd956e7890c0dbe940cd28aaff5babb9d72cd9fb735cea262c06b2"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.956599 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" event={"ID":"51679292-9818-418a-98d6-c442dc7d28e2","Type":"ContainerStarted","Data":"ee3f3b04f30841b8874fa8bdc4b3725d0b4c9e7779d65da0483477ea796c9fa9"}
Feb 19 19:20:43 crc kubenswrapper[4722]: W0219 19:20:43.959843 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9da917_db10_4eba_bdff_f68354e8d4a6.slice/crio-1c92713ba01722e88a374e7a603035ffb7e93848d2428e7ac19aa0af291721e8 WatchSource:0}: Error finding container 1c92713ba01722e88a374e7a603035ffb7e93848d2428e7ac19aa0af291721e8: Status 404 returned error can't find the container with id 1c92713ba01722e88a374e7a603035ffb7e93848d2428e7ac19aa0af291721e8
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.961345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" event={"ID":"4ded6995-db61-4962-a375-ba80816b8df9","Type":"ContainerStarted","Data":"0fdf4a7637cb5402705fa920589e29808535eef70605f1728816ba11c57d64e5"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.961392 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" event={"ID":"4ded6995-db61-4962-a375-ba80816b8df9","Type":"ContainerStarted","Data":"6e1b8dc29249f786b414083b626373283ac9d3f4f6727c121afc4a975d983b31"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.961758 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.963972 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.964283 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" event={"ID":"01bb1078-2d76-42f4-919f-3d1b73a61fd4","Type":"ContainerStarted","Data":"f2e89e44666f131e73fb4bba7527eb9deff2a5a021dd6c53a89f611465012a71"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.975574 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xn22j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.976059 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.977415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" event={"ID":"47339628-7112-4f7a-b949-fef983428ebe","Type":"ContainerStarted","Data":"39ba28e8024c70b504ad794063096c96e6db73f940777aa78abdfcf3c54bcde5"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.983567 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" event={"ID":"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc","Type":"ContainerStarted","Data":"29b5691d6a5701744331cdb2fe2e088cb8011eb8feb9a1593c6083fcbeb3e44e"}
Feb 19 19:20:43 crc kubenswrapper[4722]: I0219 19:20:43.983609 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" event={"ID":"d33763f9-ec4f-4337-b9d4-f5c25ec6eabc","Type":"ContainerStarted","Data":"dcec5d7923af0ae25f7c1d25aefaa3e6bfe154ea00901d8bca40695a3dfc2f0c"}
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.003610 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.004409 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerStarted","Data":"d0c096f9abea14bd89e01cd5df78cfd43109b66f0678b624949e1ec87cdc1cd4"}
Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.004974 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.504912743 +0000 UTC m=+144.117263127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.011444 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8"]
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.034427 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" event={"ID":"41fade82-0d8d-41b2-805e-8a92ffa97cf3","Type":"ContainerStarted","Data":"b7af01b146418b2959ab63e4c1e4bd3213696626eb16332d455a5d5ff7b805d8"}
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.072654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"]
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.089327 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb"
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.104529 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.105817 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.605796346 +0000 UTC m=+144.218146680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.119930 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln"
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.122465 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-txlzt"]
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.201784 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q"]
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.205953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.207906 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.707888588 +0000 UTC m=+144.320238912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.232632 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm"
Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.240922 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda159fd13_de0a_46d3_971f_fb7c2fc652bd.slice/crio-1fc5657702610eb06786b925025bb358aa2d401a40d7767a88efb8bc8f7c2afe WatchSource:0}: Error finding container 1fc5657702610eb06786b925025bb358aa2d401a40d7767a88efb8bc8f7c2afe: Status 404 returned error can't find the container with id 1fc5657702610eb06786b925025bb358aa2d401a40d7767a88efb8bc8f7c2afe
Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.253795 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod187676b8_1029_4153_9da5_6614e9b7892e.slice/crio-fdc6fea50eb108128f2352057c1c724769c297d884371189de3a59a1b99e73b3 WatchSource:0}: Error finding container fdc6fea50eb108128f2352057c1c724769c297d884371189de3a59a1b99e73b3: Status 404 returned error can't find the container with id fdc6fea50eb108128f2352057c1c724769c297d884371189de3a59a1b99e73b3
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.307063 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.307306 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.807284172 +0000 UTC m=+144.419634496 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.408203 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod814d776b_73c6_4354_8195_da5d3ea2d5cb.slice/crio-26e9ca32316aff4522d73bf7d95dfa5fc40ec32a584b55e39d6816351d39f964 WatchSource:0}: Error finding container 26e9ca32316aff4522d73bf7d95dfa5fc40ec32a584b55e39d6816351d39f964: Status 404 returned error can't find the container with id 26e9ca32316aff4522d73bf7d95dfa5fc40ec32a584b55e39d6816351d39f964
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.408312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.408659 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:44.90864229 +0000 UTC m=+144.520992674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.479398 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs"]
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.510522 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.510686 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.010659769 +0000 UTC m=+144.623010093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.510916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.511249 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.011233537 +0000 UTC m=+144.623583861 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.612582 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.612863 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.112847953 +0000 UTC m=+144.725198277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.655422 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.675305 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.716576 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.717328 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.21730622 +0000 UTC m=+144.829656544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.725433 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-pxpb9" podStartSLOduration=124.725413068 podStartE2EDuration="2m4.725413068s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:44.717780065 +0000 UTC m=+144.330130389" watchObservedRunningTime="2026-02-19 19:20:44.725413068 +0000 UTC m=+144.337763392" Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.744990 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-72z7j"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.761420 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podStartSLOduration=123.761401514 podStartE2EDuration="2m3.761401514s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:44.759296047 +0000 UTC m=+144.371646391" watchObservedRunningTime="2026-02-19 19:20:44.761401514 +0000 UTC m=+144.373751838" Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.773187 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ce4ba5c_9c53_4a07_a57d_3c3532449ae8.slice/crio-f21b7ed4b1229a70fc4e140ce7a8830d62bdeb54c0f4de294bb207b20a466fb4 WatchSource:0}: Error finding container f21b7ed4b1229a70fc4e140ce7a8830d62bdeb54c0f4de294bb207b20a466fb4: Status 404 returned error can't find the container with id f21b7ed4b1229a70fc4e140ce7a8830d62bdeb54c0f4de294bb207b20a466fb4 Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.813888 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1bae0d8_92c9_40e9_ad8d_cc01467c8d93.slice/crio-dce375001e0ee1f3aa1f865f8919465a72d00bc9d9364fbf21f9da42eded8771 WatchSource:0}: Error finding container dce375001e0ee1f3aa1f865f8919465a72d00bc9d9364fbf21f9da42eded8771: Status 404 returned error can't find the container with id dce375001e0ee1f3aa1f865f8919465a72d00bc9d9364fbf21f9da42eded8771 Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.818969 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.820047 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.320019721 +0000 UTC m=+144.932370085 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.920905 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:44 crc kubenswrapper[4722]: E0219 19:20:44.922457 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.422445562 +0000 UTC m=+145.034795886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.950335 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-kqs9s"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.957546 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp"] Feb 19 19:20:44 crc kubenswrapper[4722]: I0219 19:20:44.969535 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc"] Feb 19 19:20:44 crc kubenswrapper[4722]: W0219 19:20:44.975078 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb502645_30c6_437d_abc3_28de80105939.slice/crio-2342f3e915956c09366a82cda8e267b526cd8f14214d855ab7b099b581846e3d WatchSource:0}: Error finding container 2342f3e915956c09366a82cda8e267b526cd8f14214d855ab7b099b581846e3d: Status 404 returned error can't find the container with id 2342f3e915956c09366a82cda8e267b526cd8f14214d855ab7b099b581846e3d Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.000594 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-z8gcw" podStartSLOduration=125.00057357 podStartE2EDuration="2m5.00057357s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-19 19:20:44.999783745 +0000 UTC m=+144.612134069" watchObservedRunningTime="2026-02-19 19:20:45.00057357 +0000 UTC m=+144.612923914" Feb 19 19:20:45 crc kubenswrapper[4722]: W0219 19:20:45.020424 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod809781cd_b87f_423a_957c_0d20e074306e.slice/crio-20774eac2ac81fdc823c4f6cc47a68777947365e460cb18686cb34b06004aebb WatchSource:0}: Error finding container 20774eac2ac81fdc823c4f6cc47a68777947365e460cb18686cb34b06004aebb: Status 404 returned error can't find the container with id 20774eac2ac81fdc823c4f6cc47a68777947365e460cb18686cb34b06004aebb Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.021562 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.021709 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.521681103 +0000 UTC m=+145.134031427 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.022521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.022865 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.5228509 +0000 UTC m=+145.135201224 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.035025 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.048499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerStarted","Data":"3e07f956af5d9519f0aa46f0dd27ff59f1b20703afc1f6ad3a69b934175a5145"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.048810 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.052405 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4gbkr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.052458 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.053021 
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" event={"ID":"2d21a014-83a9-43d9-9cdd-5e0897757c90","Type":"ContainerStarted","Data":"4cfcd0939748cd1cb3b3ea5c2d67954b39a00eee17635f5db99d67b1fe3bc5db"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.053806 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.055100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" event={"ID":"51679292-9818-418a-98d6-c442dc7d28e2","Type":"ContainerStarted","Data":"e69bccb77c203b1feff23c4b9aad2d72dd2a1b7a82bb0d0989b26020a120dfe4"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.057246 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" event={"ID":"7c9da917-db10-4eba-bdff-f68354e8d4a6","Type":"ContainerStarted","Data":"1c92713ba01722e88a374e7a603035ffb7e93848d2428e7ac19aa0af291721e8"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.059299 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" event={"ID":"669833fe-83b5-4d4a-a78c-c360789f754b","Type":"ContainerStarted","Data":"ebd746d1c09688e84817367f69c217e579f80a9e47cf50d0256218caf1de358d"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.062258 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nzgmv" event={"ID":"3071e162-d262-4732-81ca-10bb9b507321","Type":"ContainerStarted","Data":"00dbbdf46cd7472f445a575505944f76f6c027a639dda8c496d34165cf21eec9"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.079547 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.079579 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" event={"ID":"ecc880c8-beb9-4081-8af6-64d2fa857901","Type":"ContainerStarted","Data":"025da636ca5ec87dbbbe0099c0cb554b53402034ea5236acbe0c2f2324b80d4e"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.086735 4722 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ndzb8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.086798 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.099655 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" event={"ID":"38df6625-e726-49e8-9bff-561442dcea53","Type":"ContainerStarted","Data":"d57bee2e79a376f24991d94b9a7298bc56e0eec9133577bf5557eee2bfa5f917"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.102067 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vcmxn"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.116859 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.124972 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.125996 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.625980284 +0000 UTC m=+145.238330608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.164690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" event={"ID":"01bb1078-2d76-42f4-919f-3d1b73a61fd4","Type":"ContainerStarted","Data":"248c060a2310d665aa833656479a64a87327d7b2dd4e362af6c1c55dfa6c5ecd"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.165024 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nm78h"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.166950 4722 generic.go:334] "Generic (PLEG): container finished" podID="47339628-7112-4f7a-b949-fef983428ebe" containerID="efbbd4ce089c3aed8522c44ea57e3b1991a8206547e6f741cb11181d7ef0e7b0" exitCode=0 Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.167185 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" event={"ID":"47339628-7112-4f7a-b949-fef983428ebe","Type":"ContainerDied","Data":"efbbd4ce089c3aed8522c44ea57e3b1991a8206547e6f741cb11181d7ef0e7b0"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.178036 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8ppnm"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.181736 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" event={"ID":"c02c0f7a-9c0e-4d91-aca7-9648bace7d2f","Type":"ContainerStarted","Data":"c79963fb50d6dec9455c971ee05aceffe009984dbf652eb69dfcd58ebd97ad44"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.207721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txlzt" event={"ID":"187676b8-1029-4153-9da5-6614e9b7892e","Type":"ContainerStarted","Data":"fdc6fea50eb108128f2352057c1c724769c297d884371189de3a59a1b99e73b3"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.208858 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb"] Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.213287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" event={"ID":"809781cd-b87f-423a-957c-0d20e074306e","Type":"ContainerStarted","Data":"20774eac2ac81fdc823c4f6cc47a68777947365e460cb18686cb34b06004aebb"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.224051 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lg2rd" event={"ID":"b7b80c35-8f0b-4f44-af31-0b84ebddd4b8","Type":"ContainerStarted","Data":"b249b0514a8cadd113dd409bbe53a0666baff045911e6906f6de48eee32345aa"} Feb 19 
19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.224400 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.226984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.227804 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.727788436 +0000 UTC m=+145.340138760 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.230220 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2rd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.230264 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2rd" podUID="b7b80c35-8f0b-4f44-af31-0b84ebddd4b8" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.235549 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" event={"ID":"f00e2406-a55b-4e28-bed9-a060b0780301","Type":"ContainerStarted","Data":"5425dc9b7a416e558af8638bac8fc3e5f13a0614b6c23eac81fa13b38629a876"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.235842 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.237992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" event={"ID":"a159fd13-de0a-46d3-971f-fb7c2fc652bd","Type":"ContainerStarted","Data":"1fc5657702610eb06786b925025bb358aa2d401a40d7767a88efb8bc8f7c2afe"} Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.245874 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-h4zk8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.245907 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" podUID="f00e2406-a55b-4e28-bed9-a060b0780301" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.246929 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" 
event={"ID":"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7","Type":"ContainerStarted","Data":"5a94da18c8904d8a56c548e84229a34149eefe8b670a2f7f780f7da2003a71f8"}
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.254068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" event={"ID":"c1782da0-924a-481b-b0fc-20050e168591","Type":"ContainerStarted","Data":"c3f6cf9c254cddfd544511ce0603d2e13a0cf98656ff97b926bac52ca75ade34"}
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.255428 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.258524 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" event={"ID":"41fade82-0d8d-41b2-805e-8a92ffa97cf3","Type":"ContainerStarted","Data":"35b587c3f20b8140ab0f86bba90ba106fa4be4654fc0e85a4c915fa3bc9aa2c1"}
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.262118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" event={"ID":"bb502645-30c6-437d-abc3-28de80105939","Type":"ContainerStarted","Data":"2342f3e915956c09366a82cda8e267b526cd8f14214d855ab7b099b581846e3d"}
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.267026 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" event={"ID":"f5ac7e96-c772-449a-9e9d-d7dabfc6974e","Type":"ContainerStarted","Data":"88829366351bf7263601b984ad15bcb303806ca9ba3d2c5c1f86149792538e76"}
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.268242 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-hj8tk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.268273 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.269677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" event={"ID":"0d5e5981-45e4-4970-bff2-17a6087915e9","Type":"ContainerStarted","Data":"262b347f2b9a906cc2a369ed3ff2e9b2acf60ad338b20154c8999adf62f8801a"}
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.271236 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" event={"ID":"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93","Type":"ContainerStarted","Data":"dce375001e0ee1f3aa1f865f8919465a72d00bc9d9364fbf21f9da42eded8771"}
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.278639 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hxzjr" event={"ID":"814d776b-73c6-4354-8195-da5d3ea2d5cb","Type":"ContainerStarted","Data":"26e9ca32316aff4522d73bf7d95dfa5fc40ec32a584b55e39d6816351d39f964"}
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.284996 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" event={"ID":"2d33e000-0a81-4601-8120-52dacf0b5d6b","Type":"ContainerStarted","Data":"47b1fb0e415251608ebf2cadce332bcf26820d58f70e5318532d73180da811f8"}
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.289678 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" event={"ID":"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8","Type":"ContainerStarted","Data":"f21b7ed4b1229a70fc4e140ce7a8830d62bdeb54c0f4de294bb207b20a466fb4"}
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.292123 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xn22j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.292232 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.328491 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.329912 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.829894068 +0000 UTC m=+145.442244402 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.347857 4722 csr.go:261] certificate signing request csr-kbqk4 is approved, waiting to be issued
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.353221 4722 csr.go:257] certificate signing request csr-kbqk4 is issued
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.405772 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-576vp" podStartSLOduration=125.405754314 podStartE2EDuration="2m5.405754314s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.403872263 +0000 UTC m=+145.016222597" watchObservedRunningTime="2026-02-19 19:20:45.405754314 +0000 UTC m=+145.018104638"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.432924 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.435548 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:45.935536072 +0000 UTC m=+145.547886396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.485223 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-nzgmv"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.488860 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.488907 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.533741 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.533968 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.033946616 +0000 UTC m=+145.646296950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.534228 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.534719 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.03469825 +0000 UTC m=+145.647048574 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.635032 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.635583 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.135567742 +0000 UTC m=+145.747918066 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.635725 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.636050 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.136042767 +0000 UTC m=+145.748393091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.711995 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-glfz9" podStartSLOduration=124.711975815 podStartE2EDuration="2m4.711975815s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.707613226 +0000 UTC m=+145.319963560" watchObservedRunningTime="2026-02-19 19:20:45.711975815 +0000 UTC m=+145.324326139"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.736839 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.737191 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.237176467 +0000 UTC m=+145.849526791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.761344 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-lg2rd" podStartSLOduration=124.761328467 podStartE2EDuration="2m4.761328467s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.759059725 +0000 UTC m=+145.371410049" watchObservedRunningTime="2026-02-19 19:20:45.761328467 +0000 UTC m=+145.373678781"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.841293 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kvv76" podStartSLOduration=124.841278203 podStartE2EDuration="2m4.841278203s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.839447435 +0000 UTC m=+145.451797759" watchObservedRunningTime="2026-02-19 19:20:45.841278203 +0000 UTC m=+145.453628527"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.842546 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-p99c4" podStartSLOduration=124.842539843 podStartE2EDuration="2m4.842539843s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.787531611 +0000 UTC m=+145.399881935" watchObservedRunningTime="2026-02-19 19:20:45.842539843 +0000 UTC m=+145.454890167"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.845568 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.846008 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.345995493 +0000 UTC m=+145.958345817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.875765 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" podStartSLOduration=125.8757464 podStartE2EDuration="2m5.8757464s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.874619324 +0000 UTC m=+145.486969658" watchObservedRunningTime="2026-02-19 19:20:45.8757464 +0000 UTC m=+145.488096734"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.904887 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" podStartSLOduration=124.904874347 podStartE2EDuration="2m4.904874347s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.904251218 +0000 UTC m=+145.516601532" watchObservedRunningTime="2026-02-19 19:20:45.904874347 +0000 UTC m=+145.517224671"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.943394 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" podStartSLOduration=125.943375884 podStartE2EDuration="2m5.943375884s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:45.943139057 +0000 UTC m=+145.555489381" watchObservedRunningTime="2026-02-19 19:20:45.943375884 +0000 UTC m=+145.555726208"
Feb 19 19:20:45 crc kubenswrapper[4722]: I0219 19:20:45.946121 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:45 crc kubenswrapper[4722]: E0219 19:20:45.946501 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.446474203 +0000 UTC m=+146.058824527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.024249 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-r4jmd" podStartSLOduration=125.024230659 podStartE2EDuration="2m5.024230659s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.020783269 +0000 UTC m=+145.633133593" watchObservedRunningTime="2026-02-19 19:20:46.024230659 +0000 UTC m=+145.636580983"
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.049337 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.049671 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.549660229 +0000 UTC m=+146.162010553 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.065614 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5xkn2" podStartSLOduration=125.065601086 podStartE2EDuration="2m5.065601086s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.064614614 +0000 UTC m=+145.676964948" watchObservedRunningTime="2026-02-19 19:20:46.065601086 +0000 UTC m=+145.677951410"
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.102945 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hxzjr" podStartSLOduration=6.102929365 podStartE2EDuration="6.102929365s" podCreationTimestamp="2026-02-19 19:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.102286324 +0000 UTC m=+145.714636648" watchObservedRunningTime="2026-02-19 19:20:46.102929365 +0000 UTC m=+145.715279689"
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.145002 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" podStartSLOduration=125.144979044 podStartE2EDuration="2m5.144979044s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.141250215 +0000 UTC m=+145.753600549" watchObservedRunningTime="2026-02-19 19:20:46.144979044 +0000 UTC m=+145.757329368"
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.158228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.158830 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.658807965 +0000 UTC m=+146.271158289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.186541 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" podStartSLOduration=125.186518477 podStartE2EDuration="2m5.186518477s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.1837741 +0000 UTC m=+145.796124424" watchObservedRunningTime="2026-02-19 19:20:46.186518477 +0000 UTC m=+145.798868801"
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.234444 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-nzgmv" podStartSLOduration=125.234425733 podStartE2EDuration="2m5.234425733s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.233606406 +0000 UTC m=+145.845956730" watchObservedRunningTime="2026-02-19 19:20:46.234425733 +0000 UTC m=+145.846776057"
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.259734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.260090 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.76007872 +0000 UTC m=+146.372429044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.324121 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" event={"ID":"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7","Type":"ContainerStarted","Data":"ef351aef5ad10b0d00005e8a1bd3c37aed6f5d4190aea65379d8fb9ecb740f99"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.324517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" event={"ID":"9a872ac6-8f75-4b9e-8ba6-77bfe127f9f7","Type":"ContainerStarted","Data":"786d0e815d19e5cfd8c18ce0bba924e838645f8b28cafacfc53ea2145457700b"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.327104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hxzjr" event={"ID":"814d776b-73c6-4354-8195-da5d3ea2d5cb","Type":"ContainerStarted","Data":"3c1291b5cc9c8bc13b8c696b72a60f61ccde754e12194e77e57c3708607a8443"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.331223 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" event={"ID":"2d33e000-0a81-4601-8120-52dacf0b5d6b","Type":"ContainerStarted","Data":"5501c3eb081dc14e56cf70a5f37ecc5b1996c3befffc69aff6179730a9b0b0e7"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.331261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" event={"ID":"2d33e000-0a81-4601-8120-52dacf0b5d6b","Type":"ContainerStarted","Data":"f03d0121879c803e8e680997d52d61aa79e2f6261d842e87232ef1598453a3ca"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.349134 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nm78h" event={"ID":"71548ff6-f831-48ba-af51-99fe431c447a","Type":"ContainerStarted","Data":"a6afc5bb0151fb3589a5f1b4d48148be1a0aca54a76983d626de4faa6cde1adf"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.349206 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nm78h" event={"ID":"71548ff6-f831-48ba-af51-99fe431c447a","Type":"ContainerStarted","Data":"a278268985d8271886d90a06c39fe28e63a5048150219f6a164b404e21d49a34"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.350773 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" event={"ID":"e040b47b-3688-40e2-a410-0dfa43ad8ef3","Type":"ContainerStarted","Data":"a1a12887074bc7205f6d66a879c77c0ef4f7f3a8bf0793e8b00c566ebf76f769"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.350805 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" event={"ID":"e040b47b-3688-40e2-a410-0dfa43ad8ef3","Type":"ContainerStarted","Data":"7ab0721f9382bb0a61d42969ec2a0016a1de1ce2798238d6001286a67f8fe122"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.350814 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" event={"ID":"e040b47b-3688-40e2-a410-0dfa43ad8ef3","Type":"ContainerStarted","Data":"77432a2b3182197a106526e3411ba4b9afdc0787a579a0b07be58df90225d826"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.351499 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.352489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" event={"ID":"03c35cd8-4a1b-4847-a5f2-0fe0e884d191","Type":"ContainerStarted","Data":"404220fba85b8f4cd94ce26daab4420694362116ee06580bf0d58f15925b8851"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.352514 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" event={"ID":"03c35cd8-4a1b-4847-a5f2-0fe0e884d191","Type":"ContainerStarted","Data":"95f7af19afa99ff5b31abc67f240a02dc2cf9faeb66190d33c89d73e50c1a4e4"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.355214 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-19 19:15:45 +0000 UTC, rotation deadline is 2026-12-24 08:05:44.800030569 +0000 UTC
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.355245 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7380h44m58.444787319s for next certificate rotation
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.360737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.361099 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.861086116 +0000 UTC m=+146.473436440 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.376904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" event={"ID":"01bb1078-2d76-42f4-919f-3d1b73a61fd4","Type":"ContainerStarted","Data":"ed817a0aa20bb1d426865f86a0c95ad06b1484dc026bcf82224806954734cd7f"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.389795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" event={"ID":"f5ac7e96-c772-449a-9e9d-d7dabfc6974e","Type":"ContainerStarted","Data":"a18c29ec58d9c9ca68cc394cf953a870b83d5976024ff4f95bb88828c583b002"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.389839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" event={"ID":"f5ac7e96-c772-449a-9e9d-d7dabfc6974e","Type":"ContainerStarted","Data":"25bec7bbf6562e00857247154a6af48291dca8d73706c0f322cd88f5d9d09f1a"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.408013 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txlzt" event={"ID":"187676b8-1029-4153-9da5-6614e9b7892e","Type":"ContainerStarted","Data":"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.427600 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8vrs" podStartSLOduration=125.427584434 podStartE2EDuration="2m5.427584434s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.422026347 +0000 UTC m=+146.034376671" watchObservedRunningTime="2026-02-19 19:20:46.427584434 +0000 UTC m=+146.039934758"
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.432441 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" event={"ID":"e8ae2d71-7578-4343-a1ba-5d414cd1cc4b","Type":"ContainerStarted","Data":"f16364e37b28f1c4da2487c503fce09e28b2dbcdc002629b73d6491128b47ad4"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.433627 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp"
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.442422 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-cjtjp container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body=
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.442477 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" podUID="e8ae2d71-7578-4343-a1ba-5d414cd1cc4b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused"
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.446304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" event={"ID":"2ce4ba5c-9c53-4a07-a57d-3c3532449ae8","Type":"ContainerStarted","Data":"8425b86e33f55bb836d35d4634244ca0cb341b776d588b53581c9e30ed8a79f9"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.448203 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" event={"ID":"2d21a014-83a9-43d9-9cdd-5e0897757c90","Type":"ContainerStarted","Data":"3eaf5e0d17a916f529d58c56a948245ee445ce0abc5547632c8daa87ac6ef597"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.449454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vcmxn" event={"ID":"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80","Type":"ContainerStarted","Data":"14c6449b96d273476e5619cfa1df06a0ddf8ad4b241c5b2988dd87c905f18a5b"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.449480 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vcmxn" event={"ID":"2d33aaf1-2a16-48a3-ba3a-8b0a5e66ec80","Type":"ContainerStarted","Data":"ca5e2d4d2c79c35beb41d1f1533d07d91ac580bc8fddf03fd37c30b0b401bacb"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.450831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" event={"ID":"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93","Type":"ContainerStarted","Data":"ecab273ca6f09e83f8d938ca5a5c7b07e951b67b29f42d5c366966f39555be18"}
Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.450887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" event={"ID":"b1bae0d8-92c9-40e9-ad8d-cc01467c8d93","Type":"ContainerStarted","Data":"ca577788ccc8003f8c58ecd6fd8b1b2af1187a4ea0c3dc28431fa6208739fa3f"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.462231 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.463381 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:46.963366933 +0000 UTC m=+146.575717257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.468911 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl" podStartSLOduration=125.468896699 podStartE2EDuration="2m5.468896699s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.467393662 +0000 UTC m=+146.079743986" watchObservedRunningTime="2026-02-19 19:20:46.468896699 +0000 UTC m=+146.081247023" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.468965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" event={"ID":"8c255c5e-d6d9-4772-9151-0065df6dc00d","Type":"ContainerStarted","Data":"8f83e76bc690bdd57bc344bba92f7c88169120f7b1c51ac629ed1570364c73e1"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.469008 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" event={"ID":"8c255c5e-d6d9-4772-9151-0065df6dc00d","Type":"ContainerStarted","Data":"ad52433eac2761f6de6f0241da97dc4626ecf152358f037cf4dab8f8735ee9bc"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.470338 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" 
event={"ID":"a159fd13-de0a-46d3-971f-fb7c2fc652bd","Type":"ContainerStarted","Data":"eec25437ba8663aa5716ec06c5513a50e649322d003af3053d48042e55a26585"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.483546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" event={"ID":"6a5359b9-b29a-4c86-8dc8-f00b659cecb0","Type":"ContainerStarted","Data":"e4ec9499ce1bafa92ddfa04e1301528389d974ea0247bb6448d9b10dff8fad90"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.483593 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" event={"ID":"6a5359b9-b29a-4c86-8dc8-f00b659cecb0","Type":"ContainerStarted","Data":"f9f897c30862be4ab75fbe152f33b6d3f00620127a0529adf844cce1b1651c26"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.484197 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.485243 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-hwl66 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.485295 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" podUID="6a5359b9-b29a-4c86-8dc8-f00b659cecb0" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.485624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" event={"ID":"669833fe-83b5-4d4a-a78c-c360789f754b","Type":"ContainerStarted","Data":"527d984855f487908220402ab59141e5bdeb122d89fb13cdd73d16244ee7006d"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.489365 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:46 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:46 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:46 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.489428 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.497473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" event={"ID":"b3ea57e2-def2-4a73-a86b-75be99e36e46","Type":"ContainerStarted","Data":"fa6a7a9c8534399f7df0974921698b483c39f23b873ed23dfb9c08f6107c7e10"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.497521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" event={"ID":"b3ea57e2-def2-4a73-a86b-75be99e36e46","Type":"ContainerStarted","Data":"3119525aa03a6ddfce98df5b848e0a0d1c9d02178595305e6d64992ed8ef5567"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.520642 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" event={"ID":"47339628-7112-4f7a-b949-fef983428ebe","Type":"ContainerStarted","Data":"b6d79205a0222bfdb6247fb125a326c0b7cf14f6f3d22ce6eda0bdeb3bb4b4fe"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.521224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.521923 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-gtjsk" podStartSLOduration=125.521908527 podStartE2EDuration="2m5.521908527s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.520495843 +0000 UTC m=+146.132846167" watchObservedRunningTime="2026-02-19 19:20:46.521908527 +0000 UTC m=+146.134258851" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.537577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" event={"ID":"809781cd-b87f-423a-957c-0d20e074306e","Type":"ContainerStarted","Data":"c136cc52ee912bf0f61c2dc6896709fa6e408d5cb6acf70b2dcbdcdda1cb3f18"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.549309 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" event={"ID":"0d5e5981-45e4-4970-bff2-17a6087915e9","Type":"ContainerStarted","Data":"2e394652aecdb0cb849b3a87f5903a2cfceab4d4b8a685caa540a2bfe431a66b"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.560367 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" 
event={"ID":"f25b1c29-b400-4bd5-8e63-ac31629a0aa2","Type":"ContainerStarted","Data":"b691163b677c910509674f6db5d9ff696e93d9b243cff4172a82c0c83912a8be"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.560409 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" event={"ID":"f25b1c29-b400-4bd5-8e63-ac31629a0aa2","Type":"ContainerStarted","Data":"6c6b6c9401ac9dbd4343886936faef598dc029884f065d81df8c7afe5961220e"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.562701 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" event={"ID":"5693eddb-45a4-4cee-acb8-d3c0f23d16b8","Type":"ContainerStarted","Data":"c89ce2b280b428c740d5bd22287118dd56d4145a7504015a6bb3d6f0e5e982be"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.565371 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.566956 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.066941771 +0000 UTC m=+146.679292095 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.572582 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5rfs" podStartSLOduration=125.572566211 podStartE2EDuration="2m5.572566211s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.571431254 +0000 UTC m=+146.183781578" watchObservedRunningTime="2026-02-19 19:20:46.572566211 +0000 UTC m=+146.184916535" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.577707 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" event={"ID":"7c9da917-db10-4eba-bdff-f68354e8d4a6","Type":"ContainerStarted","Data":"caed6afe54d834d8d6a61929597731be14994252ce2dbaa2f9ca830772213232"} Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.577747 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.582632 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-h4zk8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: 
I0219 19:20:46.582663 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4gbkr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.582682 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" podUID="f00e2406-a55b-4e28-bed9-a060b0780301" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.582715 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.583791 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2rd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.583811 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2rd" podUID="b7b80c35-8f0b-4f44-af31-0b84ebddd4b8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.605390 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-klvwp container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.605436 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" podUID="7c9da917-db10-4eba-bdff-f68354e8d4a6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.625595 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9sqrc" podStartSLOduration=125.625578189 podStartE2EDuration="2m5.625578189s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.622697627 +0000 UTC m=+146.235047951" watchObservedRunningTime="2026-02-19 19:20:46.625578189 +0000 UTC m=+146.237928513" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.672173 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.673781 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 19:20:47.173765203 +0000 UTC m=+146.786115527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.682717 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-txlzt" podStartSLOduration=125.682703158 podStartE2EDuration="2m5.682703158s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.663524597 +0000 UTC m=+146.275874921" watchObservedRunningTime="2026-02-19 19:20:46.682703158 +0000 UTC m=+146.295053482" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.731202 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" podStartSLOduration=125.731180011 podStartE2EDuration="2m5.731180011s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.730852951 +0000 UTC m=+146.343203275" watchObservedRunningTime="2026-02-19 19:20:46.731180011 +0000 UTC m=+146.343530335" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.733335 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8ppnm" podStartSLOduration=125.7333277 
podStartE2EDuration="2m5.7333277s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.683858385 +0000 UTC m=+146.296208709" watchObservedRunningTime="2026-02-19 19:20:46.7333277 +0000 UTC m=+146.345678024" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.756702 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-4jhs8" podStartSLOduration=125.756683564 podStartE2EDuration="2m5.756683564s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.755400303 +0000 UTC m=+146.367750627" watchObservedRunningTime="2026-02-19 19:20:46.756683564 +0000 UTC m=+146.369033878" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.775636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.775848 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.275832083 +0000 UTC m=+146.888182407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.775933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.776230 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.276224446 +0000 UTC m=+146.888574770 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.781502 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9r7q" podStartSLOduration=125.781486743 podStartE2EDuration="2m5.781486743s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.772488038 +0000 UTC m=+146.384838372" watchObservedRunningTime="2026-02-19 19:20:46.781486743 +0000 UTC m=+146.393837067" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.842293 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-759tp" podStartSLOduration=125.84227218 podStartE2EDuration="2m5.84227218s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.808312668 +0000 UTC m=+146.420662992" watchObservedRunningTime="2026-02-19 19:20:46.84227218 +0000 UTC m=+146.454622504" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.843837 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h4kvb" podStartSLOduration=125.843829599 podStartE2EDuration="2m5.843829599s" 
podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.841259867 +0000 UTC m=+146.453610191" watchObservedRunningTime="2026-02-19 19:20:46.843829599 +0000 UTC m=+146.456179923" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.884523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.885003 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.384987239 +0000 UTC m=+146.997337563 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.925675 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-72z7j" podStartSLOduration=125.925655975 podStartE2EDuration="2m5.925655975s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.892412746 +0000 UTC m=+146.504763060" watchObservedRunningTime="2026-02-19 19:20:46.925655975 +0000 UTC m=+146.538006299" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.927942 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vcmxn" podStartSLOduration=6.927934447 podStartE2EDuration="6.927934447s" podCreationTimestamp="2026-02-19 19:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.925527511 +0000 UTC m=+146.537877835" watchObservedRunningTime="2026-02-19 19:20:46.927934447 +0000 UTC m=+146.540284761" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.951033 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" podStartSLOduration=125.951017763 podStartE2EDuration="2m5.951017763s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:46.949631138 +0000 UTC m=+146.561981462" watchObservedRunningTime="2026-02-19 19:20:46.951017763 +0000 UTC m=+146.563368087" Feb 19 19:20:46 crc kubenswrapper[4722]: I0219 19:20:46.986531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:46 crc kubenswrapper[4722]: E0219 19:20:46.986887 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.486875354 +0000 UTC m=+147.099225678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.011572 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-t6ljp" podStartSLOduration=126.0115507 podStartE2EDuration="2m6.0115507s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.010511388 +0000 UTC m=+146.622861712" watchObservedRunningTime="2026-02-19 19:20:47.0115507 +0000 UTC m=+146.623901024" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.069344 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvksz" podStartSLOduration=126.069330191 podStartE2EDuration="2m6.069330191s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.05486979 +0000 UTC m=+146.667220114" watchObservedRunningTime="2026-02-19 19:20:47.069330191 +0000 UTC m=+146.681680515" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.087540 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.087860 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.58784613 +0000 UTC m=+147.200196454 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.120673 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" podStartSLOduration=126.120659624 podStartE2EDuration="2m6.120659624s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.119658393 +0000 UTC m=+146.732008717" watchObservedRunningTime="2026-02-19 19:20:47.120659624 +0000 UTC m=+146.733009948" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.152053 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" podStartSLOduration=126.152037084 podStartE2EDuration="2m6.152037084s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
19:20:47.148668826 +0000 UTC m=+146.761019150" watchObservedRunningTime="2026-02-19 19:20:47.152037084 +0000 UTC m=+146.764387408" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.188591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.188949 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.688939159 +0000 UTC m=+147.301289483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.246672 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd" podStartSLOduration=127.246632976 podStartE2EDuration="2m7.246632976s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.206920431 +0000 UTC m=+146.819270755" watchObservedRunningTime="2026-02-19 19:20:47.246632976 +0000 UTC 
m=+146.858983300" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.277528 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" podStartSLOduration=127.27751391 podStartE2EDuration="2m7.27751391s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.276631921 +0000 UTC m=+146.888982235" watchObservedRunningTime="2026-02-19 19:20:47.27751391 +0000 UTC m=+146.889864234" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.278161 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-pwpjg" podStartSLOduration=127.2781431 podStartE2EDuration="2m7.2781431s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.248751983 +0000 UTC m=+146.861102307" watchObservedRunningTime="2026-02-19 19:20:47.2781431 +0000 UTC m=+146.890493424" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.289346 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.289671 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.789656677 +0000 UTC m=+147.402007001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.363890 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.390536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.390869 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:47.890858609 +0000 UTC m=+147.503208933 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.486650 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:47 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:47 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:47 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.486726 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.491355 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.491713 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 19:20:47.991698271 +0000 UTC m=+147.604048595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.579779 4722 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ndzb8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.579835 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.582428 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" event={"ID":"bb502645-30c6-437d-abc3-28de80105939","Type":"ContainerStarted","Data":"60dcf0b9a4be5ab5a34b9f2bd3abcff72a4518eb236ec8de365ec62dad633e02"} Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.585025 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nm78h" 
event={"ID":"71548ff6-f831-48ba-af51-99fe431c447a","Type":"ContainerStarted","Data":"46422a26465d6e26c663457d9b147ecab7b8595a4e1a3b38e7524741f6b348d9"} Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.592215 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.592365 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2rd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.592415 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2rd" podUID="b7b80c35-8f0b-4f44-af31-0b84ebddd4b8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.593661 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.593986 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.093974677 +0000 UTC m=+147.706325001 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.596410 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-cjtjp" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.625839 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.625887 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.630001 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nm78h" podStartSLOduration=7.629989344 podStartE2EDuration="7.629989344s" podCreationTimestamp="2026-02-19 19:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:47.629810478 +0000 UTC m=+147.242160812" watchObservedRunningTime="2026-02-19 19:20:47.629989344 +0000 UTC m=+147.242339668" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.657225 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-hwl66" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.694091 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.694296 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.19423801 +0000 UTC m=+147.806588334 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.695314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.698250 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.198232898 +0000 UTC m=+147.810583262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.797382 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.797732 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.297716025 +0000 UTC m=+147.910066349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.824239 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.824499 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.835264 4722 patch_prober.go:28] interesting pod/apiserver-76f77b778f-bg6mf container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.27:8443/livez\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.835311 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" podUID="8c255c5e-d6d9-4772-9151-0065df6dc00d" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.27:8443/livez\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 19 19:20:47 crc kubenswrapper[4722]: I0219 19:20:47.900060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" 
Feb 19 19:20:47 crc kubenswrapper[4722]: E0219 19:20:47.900415 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.400400736 +0000 UTC m=+148.012751060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.000735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.000916 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.500888656 +0000 UTC m=+148.113238990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.001100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.001473 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.501463064 +0000 UTC m=+148.113813458 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.055418 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq" Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.102359 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.102566 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.602533392 +0000 UTC m=+148.214883726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.102901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.103322 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.603311327 +0000 UTC m=+148.215661651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.204509 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.204714 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.704683966 +0000 UTC m=+148.317034300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.204867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.205194 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.705182412 +0000 UTC m=+148.317532736 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.305917 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.306113 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.806086285 +0000 UTC m=+148.418436609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.306178 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.306469 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.806457417 +0000 UTC m=+148.418807741 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.407685 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.407865 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.407899 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.407938 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.407960 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.408112 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:48.908088373 +0000 UTC m=+148.520438697 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.409008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.415264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.418765 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.426779 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.492374 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 19:20:48 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Feb 19 19:20:48 crc kubenswrapper[4722]: [+]process-running ok
Feb 19 19:20:48 crc kubenswrapper[4722]: healthz check failed
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.492429 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.508919 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.509313 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.009296826 +0000 UTC m=+148.621647150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.589365 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" event={"ID":"bb502645-30c6-437d-abc3-28de80105939","Type":"ContainerStarted","Data":"1c40fc8578525e5a466a0589c438b55ae6944e80d0dfa7a83da7efbd3e8cc78c"}
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.591960 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-klvwp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.592006 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" podUID="7c9da917-db10-4eba-bdff-f68354e8d4a6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.594801 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6mfpq"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.606859 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.609757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.610022 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.110007723 +0000 UTC m=+148.722358037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.626756 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.634239 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.711364 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.711793 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.211778244 +0000 UTC m=+148.824128568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.812651 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.812790 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.31277216 +0000 UTC m=+148.925122484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.812914 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.813223 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.313215875 +0000 UTC m=+148.925566189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.850705 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-vrqgd"
Feb 19 19:20:48 crc kubenswrapper[4722]: I0219 19:20:48.915367 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:48 crc kubenswrapper[4722]: E0219 19:20:48.915630 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.415615335 +0000 UTC m=+149.027965659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.016843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.017229 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.517212331 +0000 UTC m=+149.129562655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.119096 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.119440 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.619425945 +0000 UTC m=+149.231776269 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.221583 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.222275 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.722262431 +0000 UTC m=+149.334612755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.319780 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-64frs"]
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.320937 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.323790 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.325132 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.825088285 +0000 UTC m=+149.437438609 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.325306 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.325633 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.825621832 +0000 UTC m=+149.437972156 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.326193 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.336087 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-64frs"]
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.429138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.429307 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.429355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9szc5\" (UniqueName: \"kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.429424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.429515 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:49.929499959 +0000 UTC m=+149.541850283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.488454 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 19:20:49 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Feb 19 19:20:49 crc kubenswrapper[4722]: [+]process-running ok
Feb 19 19:20:49 crc kubenswrapper[4722]: healthz check failed
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.488504 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.521773 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"]
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.523412 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tp9x"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.523891 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"]
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.529093 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.534086 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.534121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.534142 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.534275 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9szc5\" (UniqueName: \"kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.534915 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.535125 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.535365 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.035354471 +0000 UTC m=+149.647704795 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.547463 4722 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.568204 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9szc5\" (UniqueName: \"kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5\") pod \"community-operators-64frs\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.610629 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2bb4965fdf013ff81485111c60e1de5bff0cb3ec10055f3efa2e334f0dc3ab98"}
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.610674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"aaca4df240278c17008fc36a63273475dea5aa9226711311ac6aa6f2839afb8a"}
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.613931 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ce36a6ef0ea013be2fca11d6b0e284251ec8550e4f4352ef961ee5bf851c6d00"}
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.613962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"58d302fbb4956b3b899f9b13decd7899140aa1d4f3be3e194d3745a01343d0c1"}
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.614327 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.619352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1415ed088f539e032c87a251625d406548f68ab5172a3fa9d829a6a5ae0f184c"}
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.619408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8114d3a85535aeaa72f4582e9630b8967c980d11a0f614ba1c90371f598749ce"}
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636319 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636566 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvbkr\" (UniqueName: \"kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636614 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" event={"ID":"bb502645-30c6-437d-abc3-28de80105939","Type":"ContainerStarted","Data":"9943299ced272b97fa61876ac6166d28f9833a1d6f5199b897797517c94c4426"}
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.636812 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" event={"ID":"bb502645-30c6-437d-abc3-28de80105939","Type":"ContainerStarted","Data":"4509d1d5ead6056c1f093d934c41d3443f253286e3693d084ef23edbe7ddc5d0"}
Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.636875 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.136858873 +0000 UTC m=+149.749209197 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.666396 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64frs"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.700065 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-kqs9s" podStartSLOduration=9.700045476 podStartE2EDuration="9.700045476s" podCreationTimestamp="2026-02-19 19:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:49.697269097 +0000 UTC m=+149.309619421" watchObservedRunningTime="2026-02-19 19:20:49.700045476 +0000 UTC m=+149.312395800"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.725337 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j86kw"]
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.726362 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j86kw"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.737636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.737916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvbkr\" (UniqueName: \"kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.737964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x"
Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.737982 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x"
Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.738722 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed.
No retries permitted until 2026-02-19 19:20:50.238707887 +0000 UTC m=+149.851058211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.740932 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.741252 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.742310 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j86kw"] Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.800776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvbkr\" (UniqueName: \"kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr\") pod \"certified-operators-6tp9x\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") " pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.840611 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.841971 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.842115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxq8\" (UniqueName: \"kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.842300 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.842363 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.342322506 +0000 UTC m=+149.954672830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.858401 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.934605 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p4576"] Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.935586 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.947243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhxq8\" (UniqueName: \"kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.947303 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.947348 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.947369 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: E0219 19:20:49.949499 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.449482429 +0000 UTC m=+150.061832753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.951090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.952110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:49 crc kubenswrapper[4722]: I0219 19:20:49.989675 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4576"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.006904 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxq8\" (UniqueName: \"kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8\") pod \"community-operators-j86kw\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") " pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.055770 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.055993 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.056045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vckt\" (UniqueName: \"kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.056091 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: E0219 19:20:50.056201 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.556186067 +0000 UTC m=+150.168536391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.060399 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.102466 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-64frs"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.157855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.157922 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vckt\" (UniqueName: \"kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.157964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.157984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.158421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.158681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: E0219 19:20:50.160000 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 19:20:50.659984493 +0000 UTC m=+150.272334817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k6bqq" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.182910 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vckt\" (UniqueName: \"kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt\") pod \"certified-operators-p4576\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") " pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.245225 4722 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-19T19:20:49.547484837Z","Handler":null,"Name":""} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.255668 4722 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.255705 4722 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.258640 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.282327 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.293447 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.334458 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.366011 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.376931 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.376974 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.424262 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j86kw"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.487213 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k6bqq\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") " pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.492276 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:50 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:50 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:50 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.492319 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.606170 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p4576"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.643144 4722 generic.go:334] "Generic (PLEG): container finished" podID="0c9d3632-a132-4377-95ef-564cffb1f299" containerID="83c9ec76be9f3502d89c676d78e714eeea9b0340976175aeadfd0dc3726f4500" exitCode=0 Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.643253 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerDied","Data":"83c9ec76be9f3502d89c676d78e714eeea9b0340976175aeadfd0dc3726f4500"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.643283 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerStarted","Data":"d33d192f020b6508198a4a19887938ad42d94be353afef74a8413b4aa30e91d1"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.644601 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.644702 4722 generic.go:334] "Generic (PLEG): container finished" podID="c594681e-de0b-4b39-98d3-573c9170c898" containerID="83d63174a5dee0510e001a33beae280a6c56b7d09645762d8197fc6948f07c46" exitCode=0 Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.644760 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerDied","Data":"83d63174a5dee0510e001a33beae280a6c56b7d09645762d8197fc6948f07c46"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.644843 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerStarted","Data":"9b378dd4da61b5af99f5f93bba7c15d0d04355aa249d4e89b10b4d368ec3db4e"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.648129 4722 generic.go:334] "Generic (PLEG): container finished" podID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerID="4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6" exitCode=0 Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.648345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerDied","Data":"4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.648384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerStarted","Data":"8c40a4539d5d6930a5a906cb44965a1810a1f2192dbfb01db14eeaf97f5cc6ee"} Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.676558 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.912224 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.948756 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.949385 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.951528 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.951645 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 19 19:20:50 crc kubenswrapper[4722]: I0219 19:20:50.961461 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.079598 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.087915 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.088030 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.188549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.188654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.188717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.214203 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.262686 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.477552 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.486333 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:51 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:51 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:51 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.486387 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.503591 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vqqrf"] Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.504558 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.512568 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.519257 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqqrf"] Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.595360 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz4g5\" (UniqueName: \"kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.595838 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.595872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.660593 4722 generic.go:334] "Generic (PLEG): container finished" podID="0d5e5981-45e4-4970-bff2-17a6087915e9" containerID="2e394652aecdb0cb849b3a87f5903a2cfceab4d4b8a685caa540a2bfe431a66b" exitCode=0 Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 
19:20:51.660653 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" event={"ID":"0d5e5981-45e4-4970-bff2-17a6087915e9","Type":"ContainerDied","Data":"2e394652aecdb0cb849b3a87f5903a2cfceab4d4b8a685caa540a2bfe431a66b"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.665447 4722 generic.go:334] "Generic (PLEG): container finished" podID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerID="d8152997987bda50dd12277fbfbc9da38a131bf85945cd167cb7db72d9b9372b" exitCode=0 Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.665552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerDied","Data":"d8152997987bda50dd12277fbfbc9da38a131bf85945cd167cb7db72d9b9372b"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.665585 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerStarted","Data":"ff5ad27012e651ea99b2c5454cf7b789a1c44ed2c936a800e67aa01d7e7683b4"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.667517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"595d34f9-545d-47de-9a83-bd6210f4fe5e","Type":"ContainerStarted","Data":"1d9b1f4a4cce5c7d90fbde391391db7e28d94c0e18c38a34002436f351a36014"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.670434 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" event={"ID":"8d31d88d-2e34-4b55-b843-b8a67b957680","Type":"ContainerStarted","Data":"b11f548bfd17279778f42b1ce10841b0e20ec850d16175454f0810e6fc866fd8"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.670465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" event={"ID":"8d31d88d-2e34-4b55-b843-b8a67b957680","Type":"ContainerStarted","Data":"7fc589c7d609f9f8ea97795796aadbb293f365ba97d8b385ba4c6ea2f33eb413"} Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.670926 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.696589 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.696642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.696711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz4g5\" (UniqueName: \"kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.697383 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc 
kubenswrapper[4722]: I0219 19:20:51.699476 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.718110 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" podStartSLOduration=130.718015948 podStartE2EDuration="2m10.718015948s" podCreationTimestamp="2026-02-19 19:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:20:51.715088985 +0000 UTC m=+151.327439309" watchObservedRunningTime="2026-02-19 19:20:51.718015948 +0000 UTC m=+151.330366272" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.734118 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz4g5\" (UniqueName: \"kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5\") pod \"redhat-marketplace-vqqrf\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.835180 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.901624 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.902897 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:51 crc kubenswrapper[4722]: I0219 19:20:51.912877 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.000781 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.001069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.001099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-665pv\" (UniqueName: \"kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.102726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.102833 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.102861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-665pv\" (UniqueName: \"kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.103241 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.103357 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.122658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-665pv\" (UniqueName: \"kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv\") pod \"redhat-marketplace-hg6kw\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") " pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.193706 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-vqqrf"] Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.230693 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.436330 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:20:52 crc kubenswrapper[4722]: W0219 19:20:52.444175 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ad9ab6b_efbe_4d01_97b0_281ee8a199df.slice/crio-8dc5a71e303cb93058a38469bccf8ecf609733633925d9394dad473ed82bd95d WatchSource:0}: Error finding container 8dc5a71e303cb93058a38469bccf8ecf609733633925d9394dad473ed82bd95d: Status 404 returned error can't find the container with id 8dc5a71e303cb93058a38469bccf8ecf609733633925d9394dad473ed82bd95d Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.486027 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:52 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:52 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:52 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.486089 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.502057 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"] Feb 19 19:20:52 crc kubenswrapper[4722]: 
I0219 19:20:52.503586 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.506424 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.518521 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"] Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.610277 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.610352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbw2p\" (UniqueName: \"kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.610393 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.681072 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 
19:20:52.690658 4722 generic.go:334] "Generic (PLEG): container finished" podID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerID="0dd65a739e9f5e8ad490009cf2eebc6f6859f0fe25f4e418d1b7a49467014a17" exitCode=0 Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.690746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerDied","Data":"0dd65a739e9f5e8ad490009cf2eebc6f6859f0fe25f4e418d1b7a49467014a17"} Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.690777 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerStarted","Data":"8dc5a71e303cb93058a38469bccf8ecf609733633925d9394dad473ed82bd95d"} Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.692948 4722 generic.go:334] "Generic (PLEG): container finished" podID="595d34f9-545d-47de-9a83-bd6210f4fe5e" containerID="b18b66df00aa2ddebb51af6a1a5323f2f0daccf9de4d9b58aaa55e91465e07a5" exitCode=0 Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.693036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"595d34f9-545d-47de-9a83-bd6210f4fe5e","Type":"ContainerDied","Data":"b18b66df00aa2ddebb51af6a1a5323f2f0daccf9de4d9b58aaa55e91465e07a5"} Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.697440 4722 generic.go:334] "Generic (PLEG): container finished" podID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerID="b5c97b5b76e7afa24f8f93363368d20e4563b18ad7e8eaf0a0672fe76a243f0a" exitCode=0 Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.697493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" 
event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerDied","Data":"b5c97b5b76e7afa24f8f93363368d20e4563b18ad7e8eaf0a0672fe76a243f0a"} Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.697555 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerStarted","Data":"104233a8c5f814fc84e4081cc01af39a90044fcd055492fd733214b7e3b634d4"} Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.711588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.711644 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbw2p\" (UniqueName: \"kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.711692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.712443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " 
pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.712693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.736842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbw2p\" (UniqueName: \"kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p\") pod \"redhat-operators-rnljk\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.828047 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.835657 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bg6mf" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.847058 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.847088 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2rd container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.847087 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2rd container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.847837 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2rd" podUID="b7b80c35-8f0b-4f44-af31-0b84ebddd4b8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.847181 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lg2rd" podUID="b7b80c35-8f0b-4f44-af31-0b84ebddd4b8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.907162 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.908366 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.921425 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:20:52 crc kubenswrapper[4722]: I0219 19:20:52.998576 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.017930 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.017982 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.018449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b987n\" (UniqueName: \"kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.018498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.018546 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content\") pod \"redhat-operators-4tk99\" 
(UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.124550 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgsrz\" (UniqueName: \"kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz\") pod \"0d5e5981-45e4-4970-bff2-17a6087915e9\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.124844 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume\") pod \"0d5e5981-45e4-4970-bff2-17a6087915e9\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.124905 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume\") pod \"0d5e5981-45e4-4970-bff2-17a6087915e9\" (UID: \"0d5e5981-45e4-4970-bff2-17a6087915e9\") " Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.125083 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.125196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b987n\" (UniqueName: \"kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc 
kubenswrapper[4722]: I0219 19:20:53.125288 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.126630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.128812 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "0d5e5981-45e4-4970-bff2-17a6087915e9" (UID: "0d5e5981-45e4-4970-bff2-17a6087915e9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.129154 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.140314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz" (OuterVolumeSpecName: "kube-api-access-xgsrz") pod "0d5e5981-45e4-4970-bff2-17a6087915e9" (UID: "0d5e5981-45e4-4970-bff2-17a6087915e9"). InnerVolumeSpecName "kube-api-access-xgsrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.140860 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0d5e5981-45e4-4970-bff2-17a6087915e9" (UID: "0d5e5981-45e4-4970-bff2-17a6087915e9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.146126 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b987n\" (UniqueName: \"kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n\") pod \"redhat-operators-4tk99\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.226124 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d5e5981-45e4-4970-bff2-17a6087915e9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.226167 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0d5e5981-45e4-4970-bff2-17a6087915e9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.226193 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgsrz\" (UniqueName: \"kubernetes.io/projected/0d5e5981-45e4-4970-bff2-17a6087915e9-kube-api-access-xgsrz\") on node \"crc\" DevicePath \"\"" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.242117 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.246535 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"] Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.319330 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-klvwp" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.475828 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-h4zk8" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.482936 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.486956 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:53 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:53 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:53 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.487008 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.512068 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:20:53 crc kubenswrapper[4722]: W0219 19:20:53.554768 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12054322_fe1e_4205_b6d3_05b30024a987.slice/crio-8be595dce110543e9226c30bd0042ab6bce6646475f3656901ee019b32be514b WatchSource:0}: Error finding container 8be595dce110543e9226c30bd0042ab6bce6646475f3656901ee019b32be514b: Status 404 returned error can't find the container with id 8be595dce110543e9226c30bd0042ab6bce6646475f3656901ee019b32be514b Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.708261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerStarted","Data":"8be595dce110543e9226c30bd0042ab6bce6646475f3656901ee019b32be514b"} Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.714510 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" event={"ID":"0d5e5981-45e4-4970-bff2-17a6087915e9","Type":"ContainerDied","Data":"262b347f2b9a906cc2a369ed3ff2e9b2acf60ad338b20154c8999adf62f8801a"} Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.714590 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="262b347f2b9a906cc2a369ed3ff2e9b2acf60ad338b20154c8999adf62f8801a" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.714676 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.720894 4722 generic.go:334] "Generic (PLEG): container finished" podID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerID="78d9b73635fb9fd918479e49197028103f67da7ed33002bbffe05da3a4ec4523" exitCode=0 Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.722350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerDied","Data":"78d9b73635fb9fd918479e49197028103f67da7ed33002bbffe05da3a4ec4523"} Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.722374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerStarted","Data":"1c1bf847d9c8bd6cdac4a8d78654087bcd70cd49df2904b71c207590aa5bdd28"} Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.762728 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.764010 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.764379 4722 patch_prober.go:28] interesting pod/console-f9d7485db-txlzt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.764415 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-txlzt" podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial 
tcp 10.217.0.26:8443: connect: connection refused" Feb 19 19:20:53 crc kubenswrapper[4722]: I0219 19:20:53.921457 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.040928 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access\") pod \"595d34f9-545d-47de-9a83-bd6210f4fe5e\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.041016 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir\") pod \"595d34f9-545d-47de-9a83-bd6210f4fe5e\" (UID: \"595d34f9-545d-47de-9a83-bd6210f4fe5e\") " Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.041413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "595d34f9-545d-47de-9a83-bd6210f4fe5e" (UID: "595d34f9-545d-47de-9a83-bd6210f4fe5e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.046539 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "595d34f9-545d-47de-9a83-bd6210f4fe5e" (UID: "595d34f9-545d-47de-9a83-bd6210f4fe5e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.142630 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/595d34f9-545d-47de-9a83-bd6210f4fe5e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.142661 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/595d34f9-545d-47de-9a83-bd6210f4fe5e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.490454 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:54 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:54 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:54 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.490537 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.730141 4722 generic.go:334] "Generic (PLEG): container finished" podID="12054322-fe1e-4205-b6d3-05b30024a987" containerID="57d551ccacbc04d55c2cac5a3bb7ceb078d63f2d275222bd8c776cbc6fad014d" exitCode=0 Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.730384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerDied","Data":"57d551ccacbc04d55c2cac5a3bb7ceb078d63f2d275222bd8c776cbc6fad014d"} Feb 19 
19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.749351 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.752882 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"595d34f9-545d-47de-9a83-bd6210f4fe5e","Type":"ContainerDied","Data":"1d9b1f4a4cce5c7d90fbde391391db7e28d94c0e18c38a34002436f351a36014"} Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.752931 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d9b1f4a4cce5c7d90fbde391391db7e28d94c0e18c38a34002436f351a36014" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.871814 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 19:20:54 crc kubenswrapper[4722]: E0219 19:20:54.872404 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="595d34f9-545d-47de-9a83-bd6210f4fe5e" containerName="pruner" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.872424 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="595d34f9-545d-47de-9a83-bd6210f4fe5e" containerName="pruner" Feb 19 19:20:54 crc kubenswrapper[4722]: E0219 19:20:54.872440 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5e5981-45e4-4970-bff2-17a6087915e9" containerName="collect-profiles" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.872448 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5e5981-45e4-4970-bff2-17a6087915e9" containerName="collect-profiles" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.872599 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5e5981-45e4-4970-bff2-17a6087915e9" containerName="collect-profiles" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.872616 4722 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="595d34f9-545d-47de-9a83-bd6210f4fe5e" containerName="pruner" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.873055 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.874968 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.875158 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.882387 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.957294 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:54 crc kubenswrapper[4722]: I0219 19:20:54.957410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.058973 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.059025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.059431 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.078299 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.250936 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.487206 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:55 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:55 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:55 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.487616 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.601716 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 19:20:55 crc kubenswrapper[4722]: W0219 19:20:55.616268 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod40cd9afa_751d_46e0_b482_2098a89d2840.slice/crio-6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80 WatchSource:0}: Error finding container 6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80: Status 404 returned error can't find the container with id 6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80 Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.759976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"40cd9afa-751d-46e0-b482-2098a89d2840","Type":"ContainerStarted","Data":"6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80"} Feb 19 19:20:55 crc kubenswrapper[4722]: I0219 19:20:55.945535 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nm78h" Feb 19 19:20:56 crc kubenswrapper[4722]: I0219 19:20:56.487166 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:56 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:56 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:56 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:56 crc kubenswrapper[4722]: I0219 19:20:56.487225 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:56 crc kubenswrapper[4722]: I0219 19:20:56.780298 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"40cd9afa-751d-46e0-b482-2098a89d2840","Type":"ContainerStarted","Data":"e80b5b86c31a5449e059431e415e898753f7e3e206ec49bc3aceea682dd84694"} Feb 19 19:20:57 crc kubenswrapper[4722]: I0219 19:20:57.485747 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:57 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:57 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:57 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:57 crc kubenswrapper[4722]: I0219 19:20:57.485801 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:57 crc kubenswrapper[4722]: I0219 19:20:57.794612 4722 generic.go:334] "Generic (PLEG): container finished" podID="40cd9afa-751d-46e0-b482-2098a89d2840" containerID="e80b5b86c31a5449e059431e415e898753f7e3e206ec49bc3aceea682dd84694" exitCode=0 Feb 19 19:20:57 crc kubenswrapper[4722]: I0219 19:20:57.794660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"40cd9afa-751d-46e0-b482-2098a89d2840","Type":"ContainerDied","Data":"e80b5b86c31a5449e059431e415e898753f7e3e206ec49bc3aceea682dd84694"} Feb 19 19:20:58 crc kubenswrapper[4722]: I0219 19:20:58.484847 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:58 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:58 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:58 crc kubenswrapper[4722]: healthz check failed Feb 19 19:20:58 crc kubenswrapper[4722]: I0219 19:20:58.484920 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:20:59 crc kubenswrapper[4722]: I0219 19:20:59.485281 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:20:59 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:20:59 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:20:59 crc kubenswrapper[4722]: 
healthz check failed Feb 19 19:20:59 crc kubenswrapper[4722]: I0219 19:20:59.485946 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:21:00 crc kubenswrapper[4722]: I0219 19:21:00.486166 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:21:00 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:21:00 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:21:00 crc kubenswrapper[4722]: healthz check failed Feb 19 19:21:00 crc kubenswrapper[4722]: I0219 19:21:00.486432 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:21:01 crc kubenswrapper[4722]: I0219 19:21:01.486222 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:21:01 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:21:01 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:21:01 crc kubenswrapper[4722]: healthz check failed Feb 19 19:21:01 crc kubenswrapper[4722]: I0219 19:21:01.486302 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 
19 19:21:02 crc kubenswrapper[4722]: I0219 19:21:02.484803 4722 patch_prober.go:28] interesting pod/router-default-5444994796-nzgmv container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 19:21:02 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 19 19:21:02 crc kubenswrapper[4722]: [+]process-running ok Feb 19 19:21:02 crc kubenswrapper[4722]: healthz check failed Feb 19 19:21:02 crc kubenswrapper[4722]: I0219 19:21:02.485130 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nzgmv" podUID="3071e162-d262-4732-81ca-10bb9b507321" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 19:21:02 crc kubenswrapper[4722]: I0219 19:21:02.849834 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-lg2rd" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.207680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.215860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/493acad5-7300-4941-9311-19b3d5f21786-metrics-certs\") pod \"network-metrics-daemon-s6hhp\" (UID: \"493acad5-7300-4941-9311-19b3d5f21786\") " pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.292784 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-s6hhp" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.500588 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.503216 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-nzgmv" Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.763181 4722 patch_prober.go:28] interesting pod/console-f9d7485db-txlzt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 19 19:21:03 crc kubenswrapper[4722]: I0219 19:21:03.763235 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-txlzt" podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.740700 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.841769 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access\") pod \"40cd9afa-751d-46e0-b482-2098a89d2840\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.841830 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir\") pod \"40cd9afa-751d-46e0-b482-2098a89d2840\" (UID: \"40cd9afa-751d-46e0-b482-2098a89d2840\") " Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.841966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "40cd9afa-751d-46e0-b482-2098a89d2840" (UID: "40cd9afa-751d-46e0-b482-2098a89d2840"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.842311 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/40cd9afa-751d-46e0-b482-2098a89d2840-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.847274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "40cd9afa-751d-46e0-b482-2098a89d2840" (UID: "40cd9afa-751d-46e0-b482-2098a89d2840"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:05 crc kubenswrapper[4722]: I0219 19:21:05.943423 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40cd9afa-751d-46e0-b482-2098a89d2840-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:06 crc kubenswrapper[4722]: I0219 19:21:06.148358 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"40cd9afa-751d-46e0-b482-2098a89d2840","Type":"ContainerDied","Data":"6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80"} Feb 19 19:21:06 crc kubenswrapper[4722]: I0219 19:21:06.148407 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6045a875ab659eeb8a6a66384ccaca72612b516ca81cb0bc48e31d836f83ab80" Feb 19 19:21:06 crc kubenswrapper[4722]: I0219 19:21:06.148438 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 19:21:09 crc kubenswrapper[4722]: I0219 19:21:09.471057 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:21:09 crc kubenswrapper[4722]: I0219 19:21:09.471981 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" containerID="cri-o://0fdf4a7637cb5402705fa920589e29808535eef70605f1728816ba11c57d64e5" gracePeriod=30 Feb 19 19:21:09 crc kubenswrapper[4722]: I0219 19:21:09.480252 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:21:09 crc kubenswrapper[4722]: I0219 19:21:09.480540 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" containerID="cri-o://c3f6cf9c254cddfd544511ce0603d2e13a0cf98656ff97b926bac52ca75ade34" gracePeriod=30 Feb 19 19:21:10 crc kubenswrapper[4722]: I0219 19:21:10.686956 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" Feb 19 19:21:11 crc kubenswrapper[4722]: I0219 19:21:11.798525 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:21:11 crc kubenswrapper[4722]: I0219 19:21:11.798595 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:21:12 crc kubenswrapper[4722]: I0219 19:21:12.835752 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-hj8tk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Feb 19 19:21:12 crc kubenswrapper[4722]: I0219 19:21:12.836212 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: 
connection refused"
Feb 19 19:21:13 crc kubenswrapper[4722]: I0219 19:21:13.675103 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xn22j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 19:21:13 crc kubenswrapper[4722]: I0219 19:21:13.675190 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:21:13 crc kubenswrapper[4722]: I0219 19:21:13.769947 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-txlzt"
Feb 19 19:21:13 crc kubenswrapper[4722]: I0219 19:21:13.776538 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-txlzt"
Feb 19 19:21:16 crc kubenswrapper[4722]: I0219 19:21:16.214455 4722 generic.go:334] "Generic (PLEG): container finished" podID="c1782da0-924a-481b-b0fc-20050e168591" containerID="c3f6cf9c254cddfd544511ce0603d2e13a0cf98656ff97b926bac52ca75ade34" exitCode=0
Feb 19 19:21:16 crc kubenswrapper[4722]: I0219 19:21:16.214586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" event={"ID":"c1782da0-924a-481b-b0fc-20050e168591","Type":"ContainerDied","Data":"c3f6cf9c254cddfd544511ce0603d2e13a0cf98656ff97b926bac52ca75ade34"}
Feb 19 19:21:16 crc kubenswrapper[4722]: I0219 19:21:16.216405 4722 generic.go:334] "Generic (PLEG): container finished" podID="4ded6995-db61-4962-a375-ba80816b8df9" containerID="0fdf4a7637cb5402705fa920589e29808535eef70605f1728816ba11c57d64e5" exitCode=0
Feb 19 19:21:16 crc kubenswrapper[4722]: I0219 19:21:16.216441 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" event={"ID":"4ded6995-db61-4962-a375-ba80816b8df9","Type":"ContainerDied","Data":"0fdf4a7637cb5402705fa920589e29808535eef70605f1728816ba11c57d64e5"}
Feb 19 19:21:22 crc kubenswrapper[4722]: E0219 19:21:22.130768 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 19 19:21:22 crc kubenswrapper[4722]: E0219 19:21:22.131633 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fz4g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vqqrf_openshift-marketplace(f10dae1c-d938-4cce-893b-4ad7eca7d23f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:21:22 crc kubenswrapper[4722]: E0219 19:21:22.132855 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vqqrf" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f"
Feb 19 19:21:22 crc kubenswrapper[4722]: I0219 19:21:22.836029 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-hj8tk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body=
Feb 19 19:21:22 crc kubenswrapper[4722]: I0219 19:21:22.836083 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused"
Feb 19 19:21:23 crc kubenswrapper[4722]: E0219 19:21:23.518595 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vqqrf" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f"
Feb 19 19:21:23 crc kubenswrapper[4722]: E0219 19:21:23.575063 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 19 19:21:23 crc kubenswrapper[4722]: E0219 19:21:23.575256 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bhxq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-j86kw_openshift-marketplace(c594681e-de0b-4b39-98d3-573c9170c898): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:21:23 crc kubenswrapper[4722]: E0219 19:21:23.576660 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-j86kw" podUID="c594681e-de0b-4b39-98d3-573c9170c898"
Feb 19 19:21:23 crc kubenswrapper[4722]: I0219 19:21:23.675696 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xn22j container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 19:21:23 crc kubenswrapper[4722]: I0219 19:21:23.675811 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:21:23 crc kubenswrapper[4722]: I0219 19:21:23.883448 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vh5vl"
Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.879876 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-j86kw" podUID="c594681e-de0b-4b39-98d3-573c9170c898"
Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.941009 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j"
Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.970924 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.971218 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vckt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p4576_openshift-marketplace(f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.973226 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p4576" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe"
Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.975041 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"]
Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.975396 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cd9afa-751d-46e0-b482-2098a89d2840" containerName="pruner"
Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.975411 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cd9afa-751d-46e0-b482-2098a89d2840" containerName="pruner"
Feb 19 19:21:24 crc kubenswrapper[4722]: E0219 19:21:24.975428 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager"
Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.975439 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager"
Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.975636 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cd9afa-751d-46e0-b482-2098a89d2840" containerName="pruner"
Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.975680 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ded6995-db61-4962-a375-ba80816b8df9" containerName="controller-manager"
Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.976255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:24 crc kubenswrapper[4722]: I0219 19:21:24.983026 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"]
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.033191 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.033335 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nvbkr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6tp9x_openshift-marketplace(396bbbdf-7f78-48e7-b02c-0737c221aaa6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.034601 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config\") pod \"4ded6995-db61-4962-a375-ba80816b8df9\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") "
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.034636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles\") pod \"4ded6995-db61-4962-a375-ba80816b8df9\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") "
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.034716 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6tp9x" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.034801 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert\") pod \"4ded6995-db61-4962-a375-ba80816b8df9\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") "
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.034868 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzcqk\" (UniqueName: \"kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk\") pod \"4ded6995-db61-4962-a375-ba80816b8df9\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") "
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.034953 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca\") pod \"4ded6995-db61-4962-a375-ba80816b8df9\" (UID: \"4ded6995-db61-4962-a375-ba80816b8df9\") "
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.035823 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4ded6995-db61-4962-a375-ba80816b8df9" (UID: "4ded6995-db61-4962-a375-ba80816b8df9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.036103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca" (OuterVolumeSpecName: "client-ca") pod "4ded6995-db61-4962-a375-ba80816b8df9" (UID: "4ded6995-db61-4962-a375-ba80816b8df9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.036161 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config" (OuterVolumeSpecName: "config") pod "4ded6995-db61-4962-a375-ba80816b8df9" (UID: "4ded6995-db61-4962-a375-ba80816b8df9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.041993 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4ded6995-db61-4962-a375-ba80816b8df9" (UID: "4ded6995-db61-4962-a375-ba80816b8df9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.043119 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk" (OuterVolumeSpecName: "kube-api-access-lzcqk") pod "4ded6995-db61-4962-a375-ba80816b8df9" (UID: "4ded6995-db61-4962-a375-ba80816b8df9"). InnerVolumeSpecName "kube-api-access-lzcqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.072194 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.072306 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-665pv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hg6kw_openshift-marketplace(7ad9ab6b-efbe-4d01-97b0-281ee8a199df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.074031 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hg6kw" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df"
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.105802 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.105923 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9szc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-64frs_openshift-marketplace(0c9d3632-a132-4377-95ef-564cffb1f299): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.107331 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-64frs" podUID="0c9d3632-a132-4377-95ef-564cffb1f299"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.138867 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.138938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139011 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8ldd\" (UniqueName: \"kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139099 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139110 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139119 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ded6995-db61-4962-a375-ba80816b8df9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139128 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ded6995-db61-4962-a375-ba80816b8df9-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.139137 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzcqk\" (UniqueName: \"kubernetes.io/projected/4ded6995-db61-4962-a375-ba80816b8df9-kube-api-access-lzcqk\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.140914 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.155328 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-s6hhp"]
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.192091 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.192269 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b987n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-4tk99_openshift-marketplace(12054322-fe1e-4205-b6d3-05b30024a987): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.193436 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-4tk99" podUID="12054322-fe1e-4205-b6d3-05b30024a987"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240201 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54srr\" (UniqueName: \"kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr\") pod \"c1782da0-924a-481b-b0fc-20050e168591\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") "
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240476 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config\") pod \"c1782da0-924a-481b-b0fc-20050e168591\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") "
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240597 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert\") pod \"c1782da0-924a-481b-b0fc-20050e168591\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") "
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240633 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca\") pod \"c1782da0-924a-481b-b0fc-20050e168591\" (UID: \"c1782da0-924a-481b-b0fc-20050e168591\") "
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8ldd\" (UniqueName: \"kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240904 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.240953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.241001 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.241336 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca" (OuterVolumeSpecName: "client-ca") pod "c1782da0-924a-481b-b0fc-20050e168591" (UID: "c1782da0-924a-481b-b0fc-20050e168591"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.241391 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config" (OuterVolumeSpecName: "config") pod "c1782da0-924a-481b-b0fc-20050e168591" (UID: "c1782da0-924a-481b-b0fc-20050e168591"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.242196 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.242641 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.243592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.246044 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c1782da0-924a-481b-b0fc-20050e168591" (UID: "c1782da0-924a-481b-b0fc-20050e168591"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.246277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.250847 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr" (OuterVolumeSpecName: "kube-api-access-54srr") pod "c1782da0-924a-481b-b0fc-20050e168591" (UID: "c1782da0-924a-481b-b0fc-20050e168591"). InnerVolumeSpecName "kube-api-access-54srr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.258257 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8ldd\" (UniqueName: \"kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd\") pod \"controller-manager-6c59b95f4-rjjbc\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.280782 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" event={"ID":"c1782da0-924a-481b-b0fc-20050e168591","Type":"ContainerDied","Data":"91e29c4c51cd956e7890c0dbe940cd28aaff5babb9d72cd9fb735cea262c06b2"} Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.280857 4722 scope.go:117] "RemoveContainer" containerID="c3f6cf9c254cddfd544511ce0603d2e13a0cf98656ff97b926bac52ca75ade34" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.281083 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.283056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" event={"ID":"4ded6995-db61-4962-a375-ba80816b8df9","Type":"ContainerDied","Data":"6e1b8dc29249f786b414083b626373283ac9d3f4f6727c121afc4a975d983b31"} Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.283196 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xn22j" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.284928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" event={"ID":"493acad5-7300-4941-9311-19b3d5f21786","Type":"ContainerStarted","Data":"90a81819bec14e2cc6ec1baaf5df2e5daf052397719f77199e24b15492b6f23a"} Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.287534 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerStarted","Data":"a2f518a60109d1ac4178243c5d97f899b29c7b0af31605dc637805b2a245c236"} Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.288793 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6tp9x" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.288967 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-hg6kw" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.289812 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-64frs" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.290128 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p4576" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" Feb 19 19:21:25 crc kubenswrapper[4722]: E0219 19:21:25.290412 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-4tk99" podUID="12054322-fe1e-4205-b6d3-05b30024a987" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.296541 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.301785 4722 scope.go:117] "RemoveContainer" containerID="0fdf4a7637cb5402705fa920589e29808535eef70605f1728816ba11c57d64e5" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.340676 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.341949 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1782da0-924a-481b-b0fc-20050e168591-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.341984 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.342003 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54srr\" (UniqueName: \"kubernetes.io/projected/c1782da0-924a-481b-b0fc-20050e168591-kube-api-access-54srr\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.342023 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1782da0-924a-481b-b0fc-20050e168591-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.344544 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xn22j"] Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.425994 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.431585 4722 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-hj8tk"] Feb 19 19:21:25 crc kubenswrapper[4722]: I0219 19:21:25.530556 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"] Feb 19 19:21:25 crc kubenswrapper[4722]: W0219 19:21:25.541669 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0b15088_b052_4d3f_adca_61ff969d0699.slice/crio-ed1914331bde3da0380e958185e797f7bfc00f39c4f87c91874487af65b25caf WatchSource:0}: Error finding container ed1914331bde3da0380e958185e797f7bfc00f39c4f87c91874487af65b25caf: Status 404 returned error can't find the container with id ed1914331bde3da0380e958185e797f7bfc00f39c4f87c91874487af65b25caf Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.296324 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" event={"ID":"c0b15088-b052-4d3f-adca-61ff969d0699","Type":"ContainerStarted","Data":"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89"} Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.296778 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" event={"ID":"c0b15088-b052-4d3f-adca-61ff969d0699","Type":"ContainerStarted","Data":"ed1914331bde3da0380e958185e797f7bfc00f39c4f87c91874487af65b25caf"} Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.296801 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.298048 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" 
event={"ID":"493acad5-7300-4941-9311-19b3d5f21786","Type":"ContainerStarted","Data":"dd1d6ce5b730c6775283e4a7a31924f3ded3072999fc007734ab62952de32159"} Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.298075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-s6hhp" event={"ID":"493acad5-7300-4941-9311-19b3d5f21786","Type":"ContainerStarted","Data":"770592151c0712bb70350497af14f58fe44588b5ccfe02c05ab0268cd96a68f6"} Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.300340 4722 generic.go:334] "Generic (PLEG): container finished" podID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerID="a2f518a60109d1ac4178243c5d97f899b29c7b0af31605dc637805b2a245c236" exitCode=0 Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.300394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerDied","Data":"a2f518a60109d1ac4178243c5d97f899b29c7b0af31605dc637805b2a245c236"} Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.304070 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.323320 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" podStartSLOduration=17.323300519 podStartE2EDuration="17.323300519s" podCreationTimestamp="2026-02-19 19:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:26.321383778 +0000 UTC m=+185.933734102" watchObservedRunningTime="2026-02-19 19:21:26.323300519 +0000 UTC m=+185.935650843" Feb 19 19:21:26 crc kubenswrapper[4722]: I0219 19:21:26.362218 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-s6hhp" podStartSLOduration=166.362183457 podStartE2EDuration="2m46.362183457s" podCreationTimestamp="2026-02-19 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:26.359599355 +0000 UTC m=+185.971949699" watchObservedRunningTime="2026-02-19 19:21:26.362183457 +0000 UTC m=+185.974533781" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.080345 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ded6995-db61-4962-a375-ba80816b8df9" path="/var/lib/kubelet/pods/4ded6995-db61-4962-a375-ba80816b8df9/volumes" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.081350 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1782da0-924a-481b-b0fc-20050e168591" path="/var/lib/kubelet/pods/c1782da0-924a-481b-b0fc-20050e168591/volumes" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.308818 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerStarted","Data":"46fb6dc449baf9d204637234c7660e38bd2e8d2f352111d61b07600262a339ee"} Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.331810 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rnljk" podStartSLOduration=2.26319013 podStartE2EDuration="35.331793706s" podCreationTimestamp="2026-02-19 19:20:52 +0000 UTC" firstStartedPulling="2026-02-19 19:20:53.724429612 +0000 UTC m=+153.336779936" lastFinishedPulling="2026-02-19 19:21:26.793033188 +0000 UTC m=+186.405383512" observedRunningTime="2026-02-19 19:21:27.328742779 +0000 UTC m=+186.941093113" watchObservedRunningTime="2026-02-19 19:21:27.331793706 +0000 UTC m=+186.944144030" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.541564 4722 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"] Feb 19 19:21:27 crc kubenswrapper[4722]: E0219 19:21:27.541892 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.541915 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.542087 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1782da0-924a-481b-b0fc-20050e168591" containerName="route-controller-manager" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.542779 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.545582 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.545582 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.547123 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.547450 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.547486 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.549297 4722 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.550873 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"] Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.672622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8zpn\" (UniqueName: \"kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.672681 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.672724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.673023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: 
\"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.774747 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.775160 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8zpn\" (UniqueName: \"kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.775205 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.775233 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.776277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.777030 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.782435 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.795826 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8zpn\" (UniqueName: \"kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn\") pod \"route-controller-manager-8d8fd67f4-czb9g\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:27 crc kubenswrapper[4722]: I0219 19:21:27.869746 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:28 crc kubenswrapper[4722]: I0219 19:21:28.343128 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"] Feb 19 19:21:28 crc kubenswrapper[4722]: I0219 19:21:28.631611 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.325572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" event={"ID":"c1e07a33-6b17-400a-9697-f6746b257c3b","Type":"ContainerStarted","Data":"d13654af78793452fec2f6f9b853ada8e5aac5978a4a47d3922be4ad81917f27"} Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.325651 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" event={"ID":"c1e07a33-6b17-400a-9697-f6746b257c3b","Type":"ContainerStarted","Data":"dbd20b6d66e3f4f6e61fa79b316492ac3959c655c54ad91cf639b4c0480d6e0e"} Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.325833 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.331411 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.342366 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" podStartSLOduration=20.342346302 podStartE2EDuration="20.342346302s" podCreationTimestamp="2026-02-19 19:21:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:29.341315479 +0000 UTC m=+188.953665823" watchObservedRunningTime="2026-02-19 19:21:29.342346302 +0000 UTC m=+188.954696626" Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.431302 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"] Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.432086 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" podUID="c0b15088-b052-4d3f-adca-61ff969d0699" containerName="controller-manager" containerID="cri-o://c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89" gracePeriod=30 Feb 19 19:21:29 crc kubenswrapper[4722]: I0219 19:21:29.878082 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.002833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles\") pod \"c0b15088-b052-4d3f-adca-61ff969d0699\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.002914 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8ldd\" (UniqueName: \"kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd\") pod \"c0b15088-b052-4d3f-adca-61ff969d0699\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.002960 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config\") pod \"c0b15088-b052-4d3f-adca-61ff969d0699\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.003033 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca\") pod \"c0b15088-b052-4d3f-adca-61ff969d0699\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.003088 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert\") pod \"c0b15088-b052-4d3f-adca-61ff969d0699\" (UID: \"c0b15088-b052-4d3f-adca-61ff969d0699\") " Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.003894 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca" (OuterVolumeSpecName: "client-ca") pod "c0b15088-b052-4d3f-adca-61ff969d0699" (UID: "c0b15088-b052-4d3f-adca-61ff969d0699"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.004009 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c0b15088-b052-4d3f-adca-61ff969d0699" (UID: "c0b15088-b052-4d3f-adca-61ff969d0699"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.004016 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config" (OuterVolumeSpecName: "config") pod "c0b15088-b052-4d3f-adca-61ff969d0699" (UID: "c0b15088-b052-4d3f-adca-61ff969d0699"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.008626 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c0b15088-b052-4d3f-adca-61ff969d0699" (UID: "c0b15088-b052-4d3f-adca-61ff969d0699"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.009374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd" (OuterVolumeSpecName: "kube-api-access-l8ldd") pod "c0b15088-b052-4d3f-adca-61ff969d0699" (UID: "c0b15088-b052-4d3f-adca-61ff969d0699"). InnerVolumeSpecName "kube-api-access-l8ldd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.104085 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.104115 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c0b15088-b052-4d3f-adca-61ff969d0699-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.104124 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.104134 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8ldd\" (UniqueName: \"kubernetes.io/projected/c0b15088-b052-4d3f-adca-61ff969d0699-kube-api-access-l8ldd\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.104143 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0b15088-b052-4d3f-adca-61ff969d0699-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.331516 4722 generic.go:334] "Generic (PLEG): container finished" podID="c0b15088-b052-4d3f-adca-61ff969d0699" containerID="c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89" exitCode=0
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.331610 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.331623 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" event={"ID":"c0b15088-b052-4d3f-adca-61ff969d0699","Type":"ContainerDied","Data":"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89"}
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.331674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c59b95f4-rjjbc" event={"ID":"c0b15088-b052-4d3f-adca-61ff969d0699","Type":"ContainerDied","Data":"ed1914331bde3da0380e958185e797f7bfc00f39c4f87c91874487af65b25caf"}
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.331692 4722 scope.go:117] "RemoveContainer" containerID="c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.362270 4722 scope.go:117] "RemoveContainer" containerID="c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89"
Feb 19 19:21:30 crc kubenswrapper[4722]: E0219 19:21:30.363266 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89\": container with ID starting with c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89 not found: ID does not exist" containerID="c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.363322 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89"} err="failed to get container status \"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89\": rpc error: code = NotFound desc = could not find container \"c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89\": container with ID starting with c19900bbfd828be8725a03ab4c71c5c1a63c2afc781a3ccae04d67da6fca1d89 not found: ID does not exist"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.367391 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"]
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.369850 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c59b95f4-rjjbc"]
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.542228 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"]
Feb 19 19:21:30 crc kubenswrapper[4722]: E0219 19:21:30.543506 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0b15088-b052-4d3f-adca-61ff969d0699" containerName="controller-manager"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.543585 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b15088-b052-4d3f-adca-61ff969d0699" containerName="controller-manager"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.543721 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0b15088-b052-4d3f-adca-61ff969d0699" containerName="controller-manager"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.545692 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.547839 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.548103 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.548158 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.549492 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.550128 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.550375 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.552393 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"]
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.554824 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.713250 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp4vf\" (UniqueName: \"kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.713355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.713397 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.713420 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.713461 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.814722 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.815257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.815355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.815449 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.815534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp4vf\" (UniqueName: \"kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.816907 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.817274 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.817970 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.819109 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.840518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp4vf\" (UniqueName: \"kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf\") pod \"controller-manager-7dc7c9d8c5-cvcxr\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:30 crc kubenswrapper[4722]: I0219 19:21:30.898377 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.086746 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b15088-b052-4d3f-adca-61ff969d0699" path="/var/lib/kubelet/pods/c0b15088-b052-4d3f-adca-61ff969d0699/volumes"
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.091277 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"]
Feb 19 19:21:31 crc kubenswrapper[4722]: W0219 19:21:31.112375 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47caee59_bfc1_4d8b_89f9_f7e9dc92c22c.slice/crio-eb216779603dda494347e8df3b7d3be7b8147bee991b2254902a010452dbfd0d WatchSource:0}: Error finding container eb216779603dda494347e8df3b7d3be7b8147bee991b2254902a010452dbfd0d: Status 404 returned error can't find the container with id eb216779603dda494347e8df3b7d3be7b8147bee991b2254902a010452dbfd0d
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.336680 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" event={"ID":"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c","Type":"ContainerStarted","Data":"7c94ac163e317758dcb74268b82ce04e30cef8972d812bb8bc2cc38f6fa20bfc"}
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.337022 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.337033 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" event={"ID":"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c","Type":"ContainerStarted","Data":"eb216779603dda494347e8df3b7d3be7b8147bee991b2254902a010452dbfd0d"}
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.341981 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.355389 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" podStartSLOduration=2.355375027 podStartE2EDuration="2.355375027s" podCreationTimestamp="2026-02-19 19:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:31.354254471 +0000 UTC m=+190.966604815" watchObservedRunningTime="2026-02-19 19:21:31.355375027 +0000 UTC m=+190.967725351"
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.867733 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.868521 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.870063 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.874917 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 19 19:21:31 crc kubenswrapper[4722]: I0219 19:21:31.874949 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.031039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.031139 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.132152 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.132262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.132276 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.152242 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.189401 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.606081 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 19 19:21:32 crc kubenswrapper[4722]: W0219 19:21:32.625243 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddf91f980_2d12_4c18_8f5f_91bf1a5b4136.slice/crio-2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88 WatchSource:0}: Error finding container 2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88: Status 404 returned error can't find the container with id 2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88
Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.848054 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rnljk"
Feb 19 19:21:32 crc kubenswrapper[4722]: I0219 19:21:32.848107 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rnljk"
Feb 19 19:21:33 crc kubenswrapper[4722]: I0219 19:21:33.347941 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"df91f980-2d12-4c18-8f5f-91bf1a5b4136","Type":"ContainerStarted","Data":"0da74604871e34120ab12154e4c46a70f3a348702b25e1082ee896814de85bc4"}
Feb 19 19:21:33 crc kubenswrapper[4722]: I0219 19:21:33.348261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"df91f980-2d12-4c18-8f5f-91bf1a5b4136","Type":"ContainerStarted","Data":"2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88"}
Feb 19 19:21:33 crc kubenswrapper[4722]: I0219 19:21:33.998960 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rnljk" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="registry-server" probeResult="failure" output=<
Feb 19 19:21:33 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 19 19:21:33 crc kubenswrapper[4722]: >
Feb 19 19:21:34 crc kubenswrapper[4722]: I0219 19:21:34.354012 4722 generic.go:334] "Generic (PLEG): container finished" podID="df91f980-2d12-4c18-8f5f-91bf1a5b4136" containerID="0da74604871e34120ab12154e4c46a70f3a348702b25e1082ee896814de85bc4" exitCode=0
Feb 19 19:21:34 crc kubenswrapper[4722]: I0219 19:21:34.354055 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"df91f980-2d12-4c18-8f5f-91bf1a5b4136","Type":"ContainerDied","Data":"0da74604871e34120ab12154e4c46a70f3a348702b25e1082ee896814de85bc4"}
Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.652091 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.785284 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir\") pod \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") "
Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.785366 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access\") pod \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\" (UID: \"df91f980-2d12-4c18-8f5f-91bf1a5b4136\") "
Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.785388 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "df91f980-2d12-4c18-8f5f-91bf1a5b4136" (UID: "df91f980-2d12-4c18-8f5f-91bf1a5b4136"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.785549 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.790948 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "df91f980-2d12-4c18-8f5f-91bf1a5b4136" (UID: "df91f980-2d12-4c18-8f5f-91bf1a5b4136"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:21:35 crc kubenswrapper[4722]: I0219 19:21:35.886844 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df91f980-2d12-4c18-8f5f-91bf1a5b4136-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.366851 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"df91f980-2d12-4c18-8f5f-91bf1a5b4136","Type":"ContainerDied","Data":"2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88"}
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.366903 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2654ea996ef6dd054f2c069289e96624e15d91468b09b04dc139c987f84cae88"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.366970 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.465131 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 19 19:21:36 crc kubenswrapper[4722]: E0219 19:21:36.465517 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df91f980-2d12-4c18-8f5f-91bf1a5b4136" containerName="pruner"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.465545 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="df91f980-2d12-4c18-8f5f-91bf1a5b4136" containerName="pruner"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.465736 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="df91f980-2d12-4c18-8f5f-91bf1a5b4136" containerName="pruner"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.466356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.468412 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.471081 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.477231 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.595081 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.595126 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.595219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.696648 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.696733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.696763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.697320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.697377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.714436 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access\") pod \"installer-9-crc\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:36 crc kubenswrapper[4722]: I0219 19:21:36.791570 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 19:21:37 crc kubenswrapper[4722]: I0219 19:21:37.175533 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 19 19:21:37 crc kubenswrapper[4722]: I0219 19:21:37.373094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d98fac92-53aa-469c-b47e-4cc6edd91ef7","Type":"ContainerStarted","Data":"be8a061b191347417c7eff0e39c1d45a40ce52746371e25938f78f0f9a4f9e58"}
Feb 19 19:21:38 crc kubenswrapper[4722]: I0219 19:21:38.379205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d98fac92-53aa-469c-b47e-4cc6edd91ef7","Type":"ContainerStarted","Data":"1f68a7c9928e93107f9848c5151b976a7aa149617e7e965be09dba7a86508ed6"}
Feb 19 19:21:38 crc kubenswrapper[4722]: I0219 19:21:38.394256 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.3942385809999998 podStartE2EDuration="2.394238581s" podCreationTimestamp="2026-02-19 19:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:38.391609158 +0000 UTC m=+198.003959532" watchObservedRunningTime="2026-02-19 19:21:38.394238581 +0000 UTC m=+198.006588925"
Feb 19 19:21:39 crc kubenswrapper[4722]: I0219 19:21:39.386414 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerStarted","Data":"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081"}
Feb 19 19:21:39 crc kubenswrapper[4722]: I0219 19:21:39.389132 4722 generic.go:334] "Generic (PLEG): container finished" podID="0c9d3632-a132-4377-95ef-564cffb1f299" containerID="ecdd2f0fffaf519cc5830b6edc00c3c6f8ed2646ef4460850d3ebbfc25bad88c" exitCode=0
Feb 19 19:21:39 crc kubenswrapper[4722]: I0219 19:21:39.389169 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerDied","Data":"ecdd2f0fffaf519cc5830b6edc00c3c6f8ed2646ef4460850d3ebbfc25bad88c"}
Feb 19 19:21:39 crc kubenswrapper[4722]: I0219 19:21:39.391198 4722 generic.go:334] "Generic (PLEG): container finished" podID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerID="fed968269de56954a9bf853304185d7d7e89b05c7032995e1f8430c840f32748" exitCode=0
Feb 19 19:21:39 crc kubenswrapper[4722]: I0219 19:21:39.391589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerDied","Data":"fed968269de56954a9bf853304185d7d7e89b05c7032995e1f8430c840f32748"}
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.399367 4722 generic.go:334] "Generic (PLEG): container finished" podID="c594681e-de0b-4b39-98d3-573c9170c898" containerID="66cd55d7e5fc27ab50c52a8a0d368159c8c115d8bef1d54037565d69fb207dbc" exitCode=0
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.399739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerDied","Data":"66cd55d7e5fc27ab50c52a8a0d368159c8c115d8bef1d54037565d69fb207dbc"}
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.402757 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerStarted","Data":"5ad81a5a39e1d2d4c131bcf5c486bacca24698453f66dd8aa32cd630c49e4b9c"}
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.406453 4722 generic.go:334] "Generic (PLEG): container finished" podID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerID="df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081" exitCode=0
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.406532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerDied","Data":"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081"}
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.408786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerStarted","Data":"ddd3ca27c25bd3be69324b8ea80fd859b1d4f9a489ef9ea86f39a650b78fd038"}
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.411098 4722 generic.go:334] "Generic (PLEG): container finished" podID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerID="2aa095dd8f535949977c905c9b49fee93638ecf8347aa83cac60afa0f336cc86" exitCode=0
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.411212 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerDied","Data":"2aa095dd8f535949977c905c9b49fee93638ecf8347aa83cac60afa0f336cc86"}
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.414058 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerStarted","Data":"d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5"}
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.454428 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-64frs" podStartSLOduration=2.20180181 podStartE2EDuration="51.454410737s" podCreationTimestamp="2026-02-19 19:20:49 +0000 UTC" firstStartedPulling="2026-02-19 19:20:50.644310706 +0000 UTC m=+150.256661030" lastFinishedPulling="2026-02-19 19:21:39.896919643 +0000 UTC m=+199.509269957" observedRunningTime="2026-02-19 19:21:40.453929061 +0000 UTC m=+200.066279385" watchObservedRunningTime="2026-02-19 19:21:40.454410737 +0000 UTC m=+200.066761061"
Feb 19 19:21:40 crc kubenswrapper[4722]: I0219 19:21:40.506666 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vqqrf" podStartSLOduration=2.320538432 podStartE2EDuration="49.506645892s" podCreationTimestamp="2026-02-19 19:20:51 +0000 UTC" firstStartedPulling="2026-02-19 19:20:52.700302359 +0000 UTC m=+152.312652683" lastFinishedPulling="2026-02-19 19:21:39.886409819 +0000 UTC m=+199.498760143" observedRunningTime="2026-02-19 19:21:40.489304162 +0000 UTC m=+200.101654496" watchObservedRunningTime="2026-02-19 19:21:40.506645892 +0000 UTC m=+200.118996216"
Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.420679 4722 generic.go:334] "Generic (PLEG): container finished" podID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerID="d8aaa67a4ff9066de0c0fee741280169063042f7cb7d5dafb2624fc9902e5310" exitCode=0
Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.420762 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerDied","Data":"d8aaa67a4ff9066de0c0fee741280169063042f7cb7d5dafb2624fc9902e5310"}
Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.422920 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerStarted","Data":"e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c"}
Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.425235 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerStarted","Data":"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75"} Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.426772 4722 generic.go:334] "Generic (PLEG): container finished" podID="12054322-fe1e-4205-b6d3-05b30024a987" containerID="ddd3ca27c25bd3be69324b8ea80fd859b1d4f9a489ef9ea86f39a650b78fd038" exitCode=0 Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.426803 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerDied","Data":"ddd3ca27c25bd3be69324b8ea80fd859b1d4f9a489ef9ea86f39a650b78fd038"} Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.430037 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerStarted","Data":"59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef"} Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.467464 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j86kw" podStartSLOduration=2.305488215 podStartE2EDuration="52.467448264s" podCreationTimestamp="2026-02-19 19:20:49 +0000 UTC" firstStartedPulling="2026-02-19 19:20:50.648020574 +0000 UTC m=+150.260370898" lastFinishedPulling="2026-02-19 19:21:40.809980623 +0000 UTC m=+200.422330947" observedRunningTime="2026-02-19 19:21:41.465194432 +0000 UTC m=+201.077544776" watchObservedRunningTime="2026-02-19 19:21:41.467448264 +0000 UTC m=+201.079798588" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.491606 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p4576" podStartSLOduration=3.274298016 podStartE2EDuration="52.491590959s" 
podCreationTimestamp="2026-02-19 19:20:49 +0000 UTC" firstStartedPulling="2026-02-19 19:20:51.669812753 +0000 UTC m=+151.282163077" lastFinishedPulling="2026-02-19 19:21:40.887105706 +0000 UTC m=+200.499456020" observedRunningTime="2026-02-19 19:21:41.49070664 +0000 UTC m=+201.103056964" watchObservedRunningTime="2026-02-19 19:21:41.491590959 +0000 UTC m=+201.103941283" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.514461 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6tp9x" podStartSLOduration=2.345124762 podStartE2EDuration="52.514444563s" podCreationTimestamp="2026-02-19 19:20:49 +0000 UTC" firstStartedPulling="2026-02-19 19:20:50.6519845 +0000 UTC m=+150.264334824" lastFinishedPulling="2026-02-19 19:21:40.821304301 +0000 UTC m=+200.433654625" observedRunningTime="2026-02-19 19:21:41.509995171 +0000 UTC m=+201.122345495" watchObservedRunningTime="2026-02-19 19:21:41.514444563 +0000 UTC m=+201.126794887" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.798380 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.798432 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.798476 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:21:41 crc kubenswrapper[4722]: 
I0219 19:21:41.798967 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.799037 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d" gracePeriod=600 Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.836317 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:21:41 crc kubenswrapper[4722]: I0219 19:21:41.836361 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.436930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerStarted","Data":"8993afef8511380dc6814c043e53efeb7c1d8df71314aae95c262ecad6010502"} Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.438645 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerStarted","Data":"2f3f9d21eca082120541810f90cc0e416e5125a66419fec7ce931b180b25c24e"} Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.440426 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" 
containerID="dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d" exitCode=0 Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.440460 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d"} Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.440484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312"} Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.457227 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hg6kw" podStartSLOduration=2.355493687 podStartE2EDuration="51.457212384s" podCreationTimestamp="2026-02-19 19:20:51 +0000 UTC" firstStartedPulling="2026-02-19 19:20:52.691902421 +0000 UTC m=+152.304252745" lastFinishedPulling="2026-02-19 19:21:41.793621118 +0000 UTC m=+201.405971442" observedRunningTime="2026-02-19 19:21:42.453590849 +0000 UTC m=+202.065941173" watchObservedRunningTime="2026-02-19 19:21:42.457212384 +0000 UTC m=+202.069562708" Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.502382 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4tk99" podStartSLOduration=3.384705328 podStartE2EDuration="50.502362344s" podCreationTimestamp="2026-02-19 19:20:52 +0000 UTC" firstStartedPulling="2026-02-19 19:20:54.732913417 +0000 UTC m=+154.345263751" lastFinishedPulling="2026-02-19 19:21:41.850570443 +0000 UTC m=+201.462920767" observedRunningTime="2026-02-19 19:21:42.501669902 +0000 UTC m=+202.114020226" watchObservedRunningTime="2026-02-19 19:21:42.502362344 
+0000 UTC m=+202.114712668" Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.901271 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.943355 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-vqqrf" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="registry-server" probeResult="failure" output=< Feb 19 19:21:42 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 19:21:42 crc kubenswrapper[4722]: > Feb 19 19:21:42 crc kubenswrapper[4722]: I0219 19:21:42.951120 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:21:43 crc kubenswrapper[4722]: I0219 19:21:43.243445 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:21:43 crc kubenswrapper[4722]: I0219 19:21:43.243521 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:21:44 crc kubenswrapper[4722]: I0219 19:21:44.281474 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4tk99" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="registry-server" probeResult="failure" output=< Feb 19 19:21:44 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 19:21:44 crc kubenswrapper[4722]: > Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.410708 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"] Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.411553 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" containerName="controller-manager" containerID="cri-o://7c94ac163e317758dcb74268b82ce04e30cef8972d812bb8bc2cc38f6fa20bfc" gracePeriod=30 Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.428333 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"] Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.428970 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" podUID="c1e07a33-6b17-400a-9697-f6746b257c3b" containerName="route-controller-manager" containerID="cri-o://d13654af78793452fec2f6f9b853ada8e5aac5978a4a47d3922be4ad81917f27" gracePeriod=30 Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.666889 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.667246 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.707661 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.859684 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.859723 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:21:49 crc kubenswrapper[4722]: I0219 19:21:49.922273 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.061749 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.061803 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.136328 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.293764 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.293835 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.364053 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.487366 4722 generic.go:334] "Generic (PLEG): container finished" podID="c1e07a33-6b17-400a-9697-f6746b257c3b" containerID="d13654af78793452fec2f6f9b853ada8e5aac5978a4a47d3922be4ad81917f27" exitCode=0 Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.487983 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" event={"ID":"c1e07a33-6b17-400a-9697-f6746b257c3b","Type":"ContainerDied","Data":"d13654af78793452fec2f6f9b853ada8e5aac5978a4a47d3922be4ad81917f27"} Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.489253 4722 generic.go:334] "Generic (PLEG): container finished" podID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" 
containerID="7c94ac163e317758dcb74268b82ce04e30cef8972d812bb8bc2cc38f6fa20bfc" exitCode=0 Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.489492 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" event={"ID":"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c","Type":"ContainerDied","Data":"7c94ac163e317758dcb74268b82ce04e30cef8972d812bb8bc2cc38f6fa20bfc"} Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.532241 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.541382 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.542728 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j86kw" Feb 19 19:21:50 crc kubenswrapper[4722]: I0219 19:21:50.550342 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p4576" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.005647 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.006658 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles\") pod \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.006734 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert\") pod \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.006798 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca\") pod \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.006855 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config\") pod \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.006951 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp4vf\" (UniqueName: \"kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf\") pod \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\" (UID: \"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.007343 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" (UID: "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.007403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca" (OuterVolumeSpecName: "client-ca") pod "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" (UID: "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.007602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config" (OuterVolumeSpecName: "config") pod "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" (UID: "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.011981 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf" (OuterVolumeSpecName: "kube-api-access-xp4vf") pod "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" (UID: "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c"). InnerVolumeSpecName "kube-api-access-xp4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.012244 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.020406 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" (UID: "47caee59-bfc1-4d8b-89f9-f7e9dc92c22c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.053757 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:21:51 crc kubenswrapper[4722]: E0219 19:21:51.054258 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" containerName="controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.054286 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" containerName="controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: E0219 19:21:51.054318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e07a33-6b17-400a-9697-f6746b257c3b" containerName="route-controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.054326 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e07a33-6b17-400a-9697-f6746b257c3b" containerName="route-controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.056521 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e07a33-6b17-400a-9697-f6746b257c3b" containerName="route-controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.056583 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" containerName="controller-manager" Feb 19 19:21:51 crc kubenswrapper[4722]: 
I0219 19:21:51.068815 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.069523 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108196 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca\") pod \"c1e07a33-6b17-400a-9697-f6746b257c3b\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108285 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8zpn\" (UniqueName: \"kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn\") pod \"c1e07a33-6b17-400a-9697-f6746b257c3b\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108324 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert\") pod \"c1e07a33-6b17-400a-9697-f6746b257c3b\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108359 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config\") pod \"c1e07a33-6b17-400a-9697-f6746b257c3b\" (UID: \"c1e07a33-6b17-400a-9697-f6746b257c3b\") " Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108610 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7wvb\" (UniqueName: \"kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108781 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 
19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108899 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp4vf\" (UniqueName: \"kubernetes.io/projected/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-kube-api-access-xp4vf\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108932 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108951 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108968 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.108984 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.109365 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca" (OuterVolumeSpecName: "client-ca") pod "c1e07a33-6b17-400a-9697-f6746b257c3b" (UID: "c1e07a33-6b17-400a-9697-f6746b257c3b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.110841 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config" (OuterVolumeSpecName: "config") pod "c1e07a33-6b17-400a-9697-f6746b257c3b" (UID: "c1e07a33-6b17-400a-9697-f6746b257c3b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.123745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c1e07a33-6b17-400a-9697-f6746b257c3b" (UID: "c1e07a33-6b17-400a-9697-f6746b257c3b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.123798 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn" (OuterVolumeSpecName: "kube-api-access-m8zpn") pod "c1e07a33-6b17-400a-9697-f6746b257c3b" (UID: "c1e07a33-6b17-400a-9697-f6746b257c3b"). InnerVolumeSpecName "kube-api-access-m8zpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.209901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.209994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7wvb\" (UniqueName: \"kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210134 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210212 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210274 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-client-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210293 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8zpn\" (UniqueName: \"kubernetes.io/projected/c1e07a33-6b17-400a-9697-f6746b257c3b-kube-api-access-m8zpn\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210310 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1e07a33-6b17-400a-9697-f6746b257c3b-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.210325 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1e07a33-6b17-400a-9697-f6746b257c3b-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.211125 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.211447 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.211754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.213851 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.228965 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7wvb\" (UniqueName: \"kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb\") pod \"controller-manager-6c79b9864-zpjrn\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.381801 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.504801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g" event={"ID":"c1e07a33-6b17-400a-9697-f6746b257c3b","Type":"ContainerDied","Data":"dbd20b6d66e3f4f6e61fa79b316492ac3959c655c54ad91cf639b4c0480d6e0e"}
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.504830 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.504859 4722 scope.go:117] "RemoveContainer" containerID="d13654af78793452fec2f6f9b853ada8e5aac5978a4a47d3922be4ad81917f27"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.521979 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.522887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" event={"ID":"47caee59-bfc1-4d8b-89f9-f7e9dc92c22c","Type":"ContainerDied","Data":"eb216779603dda494347e8df3b7d3be7b8147bee991b2254902a010452dbfd0d"}
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.542618 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4576"]
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.562135 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"]
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.566293 4722 scope.go:117] "RemoveContainer" containerID="7c94ac163e317758dcb74268b82ce04e30cef8972d812bb8bc2cc38f6fa20bfc"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.567501 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr"]
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.572411 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"]
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.576587 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8d8fd67f4-czb9g"]
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.648494 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"]
Feb 19 19:21:51 crc kubenswrapper[4722]: W0219 19:21:51.654602 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b05f432_113b_41f2_8c75_ec167057d648.slice/crio-9c530f54177f0ff087e91411299d1317bfd8535f2cd89ca09d2b6c3da2e9b3cc WatchSource:0}: Error finding container 9c530f54177f0ff087e91411299d1317bfd8535f2cd89ca09d2b6c3da2e9b3cc: Status 404 returned error can't find the container with id 9c530f54177f0ff087e91411299d1317bfd8535f2cd89ca09d2b6c3da2e9b3cc
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.888410 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vqqrf"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.900218 4722 patch_prober.go:28] interesting pod/controller-manager-7dc7c9d8c5-cvcxr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.900266 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7dc7c9d8c5-cvcxr" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:21:51 crc kubenswrapper[4722]: I0219 19:21:51.946524 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vqqrf"
Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.231010 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hg6kw"
Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.231366 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hg6kw"
Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.530634 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" event={"ID":"0b05f432-113b-41f2-8c75-ec167057d648","Type":"ContainerStarted","Data":"125ae16b1eac7eba451bf3116e5570f9b87ca26d1938c5f7d79a5305a7cb39d5"}
Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.530704 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" event={"ID":"0b05f432-113b-41f2-8c75-ec167057d648","Type":"ContainerStarted","Data":"9c530f54177f0ff087e91411299d1317bfd8535f2cd89ca09d2b6c3da2e9b3cc"}
Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.532906 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p4576" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="registry-server" containerID="cri-o://59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef" gracePeriod=2
Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.536606 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j86kw"]
Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.536823 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j86kw" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="registry-server" containerID="cri-o://e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c" gracePeriod=2
Feb 19 19:21:52 crc kubenswrapper[4722]: I0219 19:21:52.649364 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hg6kw"
Feb 19 19:21:52 crc kubenswrapper[4722]: E0219 19:21:52.901476 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc594681e_de0b_4b39_98d3_573c9170c898.slice/crio-e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3947bdb_5e5a_43c8_b23d_d5aa97ebaebe.slice/crio-59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.079010 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47caee59-bfc1-4d8b-89f9-f7e9dc92c22c" path="/var/lib/kubelet/pods/47caee59-bfc1-4d8b-89f9-f7e9dc92c22c/volumes"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.080325 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e07a33-6b17-400a-9697-f6746b257c3b" path="/var/lib/kubelet/pods/c1e07a33-6b17-400a-9697-f6746b257c3b/volumes"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.334695 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4tk99"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.372399 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4tk99"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.543935 4722 generic.go:334] "Generic (PLEG): container finished" podID="c594681e-de0b-4b39-98d3-573c9170c898" containerID="e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c" exitCode=0
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.543982 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerDied","Data":"e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c"}
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.546697 4722 generic.go:334] "Generic (PLEG): container finished" podID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerID="59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef" exitCode=0
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.546722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerDied","Data":"59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef"}
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.565978 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"]
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.566284 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" podStartSLOduration=4.566265236 podStartE2EDuration="4.566265236s" podCreationTimestamp="2026-02-19 19:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:53.564259193 +0000 UTC m=+213.176609517" watchObservedRunningTime="2026-02-19 19:21:53.566265236 +0000 UTC m=+213.178615570"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.567042 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.571332 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.571625 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.571646 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.573733 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.573759 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.574024 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.581316 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"]
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.613320 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hg6kw"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.638276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.739853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.740112 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm66f\" (UniqueName: \"kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.740274 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.740434 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.741509 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.843884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm66f\" (UniqueName: \"kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.844274 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.844919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.844992 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.852058 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.865501 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm66f\" (UniqueName: \"kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f\") pod \"route-controller-manager-569958d7fb-gtl9v\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:53 crc kubenswrapper[4722]: I0219 19:21:53.891899 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.150821 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j86kw"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.325133 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"]
Feb 19 19:21:54 crc kubenswrapper[4722]: W0219 19:21:54.336441 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda22babc4_c86d_4152_8113_84595c89b271.slice/crio-94db865038a857956c4d41e2860541b7152d327e207050b03c8c6a3a5482128d WatchSource:0}: Error finding container 94db865038a857956c4d41e2860541b7152d327e207050b03c8c6a3a5482128d: Status 404 returned error can't find the container with id 94db865038a857956c4d41e2860541b7152d327e207050b03c8c6a3a5482128d
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.350250 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities\") pod \"c594681e-de0b-4b39-98d3-573c9170c898\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") "
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.350310 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhxq8\" (UniqueName: \"kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8\") pod \"c594681e-de0b-4b39-98d3-573c9170c898\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") "
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.350383 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content\") pod \"c594681e-de0b-4b39-98d3-573c9170c898\" (UID: \"c594681e-de0b-4b39-98d3-573c9170c898\") "
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.353355 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities" (OuterVolumeSpecName: "utilities") pod "c594681e-de0b-4b39-98d3-573c9170c898" (UID: "c594681e-de0b-4b39-98d3-573c9170c898"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.356501 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8" (OuterVolumeSpecName: "kube-api-access-bhxq8") pod "c594681e-de0b-4b39-98d3-573c9170c898" (UID: "c594681e-de0b-4b39-98d3-573c9170c898"). InnerVolumeSpecName "kube-api-access-bhxq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.412439 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4576"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.416605 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c594681e-de0b-4b39-98d3-573c9170c898" (UID: "c594681e-de0b-4b39-98d3-573c9170c898"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.451754 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.451789 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhxq8\" (UniqueName: \"kubernetes.io/projected/c594681e-de0b-4b39-98d3-573c9170c898-kube-api-access-bhxq8\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.451800 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c594681e-de0b-4b39-98d3-573c9170c898-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.561401 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content\") pod \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") "
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.561539 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vckt\" (UniqueName: \"kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt\") pod \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") "
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.561566 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities\") pod \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\" (UID: \"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe\") "
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.562707 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities" (OuterVolumeSpecName: "utilities") pod "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" (UID: "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.564714 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p4576"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.564760 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p4576" event={"ID":"f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe","Type":"ContainerDied","Data":"ff5ad27012e651ea99b2c5454cf7b789a1c44ed2c936a800e67aa01d7e7683b4"}
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.567221 4722 scope.go:117] "RemoveContainer" containerID="59abb000514be8d9f59000c1e9c4b40a7fed4fed6d9e61216969f53d819ffdef"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.567833 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt" (OuterVolumeSpecName: "kube-api-access-6vckt") pod "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" (UID: "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe"). InnerVolumeSpecName "kube-api-access-6vckt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.571872 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j86kw"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.571937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j86kw" event={"ID":"c594681e-de0b-4b39-98d3-573c9170c898","Type":"ContainerDied","Data":"9b378dd4da61b5af99f5f93bba7c15d0d04355aa249d4e89b10b4d368ec3db4e"}
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.573866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" event={"ID":"a22babc4-c86d-4152-8113-84595c89b271","Type":"ContainerStarted","Data":"94db865038a857956c4d41e2860541b7152d327e207050b03c8c6a3a5482128d"}
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.594412 4722 scope.go:117] "RemoveContainer" containerID="2aa095dd8f535949977c905c9b49fee93638ecf8347aa83cac60afa0f336cc86"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.608276 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j86kw"]
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.610822 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j86kw"]
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.628232 4722 scope.go:117] "RemoveContainer" containerID="d8152997987bda50dd12277fbfbc9da38a131bf85945cd167cb7db72d9b9372b"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.642273 4722 scope.go:117] "RemoveContainer" containerID="e842cd93c8af4269ffab8a136762be28cba28d7ca69bf398c59e72796317d60c"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.652074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" (UID: "f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.658989 4722 scope.go:117] "RemoveContainer" containerID="66cd55d7e5fc27ab50c52a8a0d368159c8c115d8bef1d54037565d69fb207dbc"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.663109 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vckt\" (UniqueName: \"kubernetes.io/projected/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-kube-api-access-6vckt\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.663128 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.663137 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.678138 4722 scope.go:117] "RemoveContainer" containerID="83d63174a5dee0510e001a33beae280a6c56b7d09645762d8197fc6948f07c46"
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.895033 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p4576"]
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.897321 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p4576"]
Feb 19 19:21:54 crc kubenswrapper[4722]: I0219 19:21:54.936802 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"]
Feb 19 19:21:55 crc kubenswrapper[4722]: I0219 19:21:55.079247 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c594681e-de0b-4b39-98d3-573c9170c898" path="/var/lib/kubelet/pods/c594681e-de0b-4b39-98d3-573c9170c898/volumes"
Feb 19 19:21:55 crc kubenswrapper[4722]: I0219 19:21:55.079914 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" path="/var/lib/kubelet/pods/f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe/volumes"
Feb 19 19:21:55 crc kubenswrapper[4722]: I0219 19:21:55.581335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" event={"ID":"a22babc4-c86d-4152-8113-84595c89b271","Type":"ContainerStarted","Data":"3a4ef31c001b0ddd6e8c5f948ccd90a4ebad4e13bb8c11170070152df207a255"}
Feb 19 19:21:55 crc kubenswrapper[4722]: I0219 19:21:55.581481 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hg6kw" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="registry-server" containerID="cri-o://8993afef8511380dc6814c043e53efeb7c1d8df71314aae95c262ecad6010502" gracePeriod=2
Feb 19 19:21:55 crc kubenswrapper[4722]: I0219 19:21:55.599628 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" podStartSLOduration=6.599609411 podStartE2EDuration="6.599609411s" podCreationTimestamp="2026-02-19 19:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:21:55.596127161 +0000 UTC m=+215.208477485" watchObservedRunningTime="2026-02-19 19:21:55.599609411 +0000 UTC m=+215.211959725"
Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.590111 4722 generic.go:334] "Generic (PLEG): container finished" podID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerID="8993afef8511380dc6814c043e53efeb7c1d8df71314aae95c262ecad6010502" exitCode=0
Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.590141 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerDied","Data":"8993afef8511380dc6814c043e53efeb7c1d8df71314aae95c262ecad6010502"}
Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.590720 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.595626 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"
Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.647653 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hg6kw"
Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.687644 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities\") pod \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") "
Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.687708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content\") pod \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") "
Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.687781 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-665pv\" (UniqueName: \"kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv\") pod \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\" (UID: \"7ad9ab6b-efbe-4d01-97b0-281ee8a199df\") "
Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.689941
4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities" (OuterVolumeSpecName: "utilities") pod "7ad9ab6b-efbe-4d01-97b0-281ee8a199df" (UID: "7ad9ab6b-efbe-4d01-97b0-281ee8a199df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.696947 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv" (OuterVolumeSpecName: "kube-api-access-665pv") pod "7ad9ab6b-efbe-4d01-97b0-281ee8a199df" (UID: "7ad9ab6b-efbe-4d01-97b0-281ee8a199df"). InnerVolumeSpecName "kube-api-access-665pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.718743 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ad9ab6b-efbe-4d01-97b0-281ee8a199df" (UID: "7ad9ab6b-efbe-4d01-97b0-281ee8a199df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.789601 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-665pv\" (UniqueName: \"kubernetes.io/projected/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-kube-api-access-665pv\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.789644 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:56 crc kubenswrapper[4722]: I0219 19:21:56.789657 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad9ab6b-efbe-4d01-97b0-281ee8a199df-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.341006 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.341385 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4tk99" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="registry-server" containerID="cri-o://2f3f9d21eca082120541810f90cc0e416e5125a66419fec7ce931b180b25c24e" gracePeriod=2 Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.598806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hg6kw" event={"ID":"7ad9ab6b-efbe-4d01-97b0-281ee8a199df","Type":"ContainerDied","Data":"8dc5a71e303cb93058a38469bccf8ecf609733633925d9394dad473ed82bd95d"} Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.598882 4722 scope.go:117] "RemoveContainer" containerID="8993afef8511380dc6814c043e53efeb7c1d8df71314aae95c262ecad6010502" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.599283 4722 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hg6kw" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.604453 4722 generic.go:334] "Generic (PLEG): container finished" podID="12054322-fe1e-4205-b6d3-05b30024a987" containerID="2f3f9d21eca082120541810f90cc0e416e5125a66419fec7ce931b180b25c24e" exitCode=0 Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.604558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerDied","Data":"2f3f9d21eca082120541810f90cc0e416e5125a66419fec7ce931b180b25c24e"} Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.629369 4722 scope.go:117] "RemoveContainer" containerID="d8aaa67a4ff9066de0c0fee741280169063042f7cb7d5dafb2624fc9902e5310" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.640533 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.646640 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hg6kw"] Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.674257 4722 scope.go:117] "RemoveContainer" containerID="0dd65a739e9f5e8ad490009cf2eebc6f6859f0fe25f4e418d1b7a49467014a17" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.817466 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.903392 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities\") pod \"12054322-fe1e-4205-b6d3-05b30024a987\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.903489 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content\") pod \"12054322-fe1e-4205-b6d3-05b30024a987\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.903523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b987n\" (UniqueName: \"kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n\") pod \"12054322-fe1e-4205-b6d3-05b30024a987\" (UID: \"12054322-fe1e-4205-b6d3-05b30024a987\") " Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.904656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities" (OuterVolumeSpecName: "utilities") pod "12054322-fe1e-4205-b6d3-05b30024a987" (UID: "12054322-fe1e-4205-b6d3-05b30024a987"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:57 crc kubenswrapper[4722]: I0219 19:21:57.910323 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n" (OuterVolumeSpecName: "kube-api-access-b987n") pod "12054322-fe1e-4205-b6d3-05b30024a987" (UID: "12054322-fe1e-4205-b6d3-05b30024a987"). InnerVolumeSpecName "kube-api-access-b987n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.004685 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.004724 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b987n\" (UniqueName: \"kubernetes.io/projected/12054322-fe1e-4205-b6d3-05b30024a987-kube-api-access-b987n\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.041855 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12054322-fe1e-4205-b6d3-05b30024a987" (UID: "12054322-fe1e-4205-b6d3-05b30024a987"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.106148 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12054322-fe1e-4205-b6d3-05b30024a987-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.613602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4tk99" event={"ID":"12054322-fe1e-4205-b6d3-05b30024a987","Type":"ContainerDied","Data":"8be595dce110543e9226c30bd0042ab6bce6646475f3656901ee019b32be514b"} Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.613660 4722 scope.go:117] "RemoveContainer" containerID="2f3f9d21eca082120541810f90cc0e416e5125a66419fec7ce931b180b25c24e" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.613770 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4tk99" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.633684 4722 scope.go:117] "RemoveContainer" containerID="ddd3ca27c25bd3be69324b8ea80fd859b1d4f9a489ef9ea86f39a650b78fd038" Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.649956 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.653767 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4tk99"] Feb 19 19:21:58 crc kubenswrapper[4722]: I0219 19:21:58.666339 4722 scope.go:117] "RemoveContainer" containerID="57d551ccacbc04d55c2cac5a3bb7ceb078d63f2d275222bd8c776cbc6fad014d" Feb 19 19:21:59 crc kubenswrapper[4722]: I0219 19:21:59.081933 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12054322-fe1e-4205-b6d3-05b30024a987" path="/var/lib/kubelet/pods/12054322-fe1e-4205-b6d3-05b30024a987/volumes" Feb 19 19:21:59 crc kubenswrapper[4722]: I0219 19:21:59.082647 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" path="/var/lib/kubelet/pods/7ad9ab6b-efbe-4d01-97b0-281ee8a199df/volumes" Feb 19 19:22:01 crc kubenswrapper[4722]: I0219 19:22:01.382396 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:22:01 crc kubenswrapper[4722]: I0219 19:22:01.389411 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:22:02 crc kubenswrapper[4722]: I0219 19:22:02.528536 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ndzb8"] Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.412015 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.412571 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" podUID="0b05f432-113b-41f2-8c75-ec167057d648" containerName="controller-manager" containerID="cri-o://125ae16b1eac7eba451bf3116e5570f9b87ca26d1938c5f7d79a5305a7cb39d5" gracePeriod=30 Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.437998 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"] Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.438274 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" podUID="a22babc4-c86d-4152-8113-84595c89b271" containerName="route-controller-manager" containerID="cri-o://3a4ef31c001b0ddd6e8c5f948ccd90a4ebad4e13bb8c11170070152df207a255" gracePeriod=30 Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.691957 4722 generic.go:334] "Generic (PLEG): container finished" podID="0b05f432-113b-41f2-8c75-ec167057d648" containerID="125ae16b1eac7eba451bf3116e5570f9b87ca26d1938c5f7d79a5305a7cb39d5" exitCode=0 Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.692007 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" event={"ID":"0b05f432-113b-41f2-8c75-ec167057d648","Type":"ContainerDied","Data":"125ae16b1eac7eba451bf3116e5570f9b87ca26d1938c5f7d79a5305a7cb39d5"} Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.693611 4722 generic.go:334] "Generic (PLEG): container finished" podID="a22babc4-c86d-4152-8113-84595c89b271" containerID="3a4ef31c001b0ddd6e8c5f948ccd90a4ebad4e13bb8c11170070152df207a255" exitCode=0 Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.693646 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" event={"ID":"a22babc4-c86d-4152-8113-84595c89b271","Type":"ContainerDied","Data":"3a4ef31c001b0ddd6e8c5f948ccd90a4ebad4e13bb8c11170070152df207a255"} Feb 19 19:22:09 crc kubenswrapper[4722]: I0219 19:22:09.887009 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.055382 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm66f\" (UniqueName: \"kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f\") pod \"a22babc4-c86d-4152-8113-84595c89b271\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.055436 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert\") pod \"a22babc4-c86d-4152-8113-84595c89b271\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.055488 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca\") pod \"a22babc4-c86d-4152-8113-84595c89b271\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.055512 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config\") pod \"a22babc4-c86d-4152-8113-84595c89b271\" (UID: \"a22babc4-c86d-4152-8113-84595c89b271\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.056315 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca" (OuterVolumeSpecName: "client-ca") pod "a22babc4-c86d-4152-8113-84595c89b271" (UID: "a22babc4-c86d-4152-8113-84595c89b271"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.056372 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config" (OuterVolumeSpecName: "config") pod "a22babc4-c86d-4152-8113-84595c89b271" (UID: "a22babc4-c86d-4152-8113-84595c89b271"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.066184 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a22babc4-c86d-4152-8113-84595c89b271" (UID: "a22babc4-c86d-4152-8113-84595c89b271"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.067439 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f" (OuterVolumeSpecName: "kube-api-access-pm66f") pod "a22babc4-c86d-4152-8113-84595c89b271" (UID: "a22babc4-c86d-4152-8113-84595c89b271"). InnerVolumeSpecName "kube-api-access-pm66f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.157010 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm66f\" (UniqueName: \"kubernetes.io/projected/a22babc4-c86d-4152-8113-84595c89b271-kube-api-access-pm66f\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.157040 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a22babc4-c86d-4152-8113-84595c89b271-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.157051 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.157079 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a22babc4-c86d-4152-8113-84595c89b271-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.468446 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.561784 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert\") pod \"0b05f432-113b-41f2-8c75-ec167057d648\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.562091 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca\") pod \"0b05f432-113b-41f2-8c75-ec167057d648\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.562125 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles\") pod \"0b05f432-113b-41f2-8c75-ec167057d648\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.562195 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config\") pod \"0b05f432-113b-41f2-8c75-ec167057d648\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.562230 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7wvb\" (UniqueName: \"kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb\") pod \"0b05f432-113b-41f2-8c75-ec167057d648\" (UID: \"0b05f432-113b-41f2-8c75-ec167057d648\") " Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.563082 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca" (OuterVolumeSpecName: "client-ca") pod "0b05f432-113b-41f2-8c75-ec167057d648" (UID: "0b05f432-113b-41f2-8c75-ec167057d648"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.563117 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0b05f432-113b-41f2-8c75-ec167057d648" (UID: "0b05f432-113b-41f2-8c75-ec167057d648"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.563189 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config" (OuterVolumeSpecName: "config") pod "0b05f432-113b-41f2-8c75-ec167057d648" (UID: "0b05f432-113b-41f2-8c75-ec167057d648"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.565632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b05f432-113b-41f2-8c75-ec167057d648" (UID: "0b05f432-113b-41f2-8c75-ec167057d648"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.565890 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb" (OuterVolumeSpecName: "kube-api-access-x7wvb") pod "0b05f432-113b-41f2-8c75-ec167057d648" (UID: "0b05f432-113b-41f2-8c75-ec167057d648"). InnerVolumeSpecName "kube-api-access-x7wvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.586774 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k"] Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587130 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587183 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587206 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587219 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587232 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587243 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587256 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587268 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587291 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587304 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587321 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b05f432-113b-41f2-8c75-ec167057d648" containerName="controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587333 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b05f432-113b-41f2-8c75-ec167057d648" containerName="controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587350 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587360 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587373 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587383 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587397 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587408 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587462 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587474 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="extract-utilities" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587490 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587502 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587517 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587528 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="extract-content" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587539 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a22babc4-c86d-4152-8113-84595c89b271" containerName="route-controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587550 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a22babc4-c86d-4152-8113-84595c89b271" containerName="route-controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: E0219 19:22:10.587564 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587575 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587750 4722 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="0b05f432-113b-41f2-8c75-ec167057d648" containerName="controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587775 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3947bdb-5e5a-43c8-b23d-d5aa97ebaebe" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587795 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a22babc4-c86d-4152-8113-84595c89b271" containerName="route-controller-manager" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587809 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad9ab6b-efbe-4d01-97b0-281ee8a199df" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587820 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="12054322-fe1e-4205-b6d3-05b30024a987" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.587840 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c594681e-de0b-4b39-98d3-573c9170c898" containerName="registry-server" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.588490 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.594592 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.595733 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.599138 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.602406 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.664212 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b05f432-113b-41f2-8c75-ec167057d648-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.664249 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.664260 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.664271 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b05f432-113b-41f2-8c75-ec167057d648-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.664283 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7wvb\" (UniqueName: \"kubernetes.io/projected/0b05f432-113b-41f2-8c75-ec167057d648-kube-api-access-x7wvb\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.701063 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" event={"ID":"a22babc4-c86d-4152-8113-84595c89b271","Type":"ContainerDied","Data":"94db865038a857956c4d41e2860541b7152d327e207050b03c8c6a3a5482128d"} Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.701107 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.701137 4722 scope.go:117] "RemoveContainer" containerID="3a4ef31c001b0ddd6e8c5f948ccd90a4ebad4e13bb8c11170070152df207a255" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.702536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" event={"ID":"0b05f432-113b-41f2-8c75-ec167057d648","Type":"ContainerDied","Data":"9c530f54177f0ff087e91411299d1317bfd8535f2cd89ca09d2b6c3da2e9b3cc"} Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.702646 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c79b9864-zpjrn" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.721190 4722 scope.go:117] "RemoveContainer" containerID="125ae16b1eac7eba451bf3116e5570f9b87ca26d1938c5f7d79a5305a7cb39d5" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.736171 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.741719 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c79b9864-zpjrn"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.745529 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.748775 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-569958d7fb-gtl9v"] Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765136 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-config\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-config\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765226 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f2cw\" (UniqueName: \"kubernetes.io/projected/dad06c7c-a6ab-40f3-860c-87def86419fd-kube-api-access-9f2cw\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-client-ca\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765268 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95c29\" (UniqueName: \"kubernetes.io/projected/053ce374-dacc-4077-a873-22ff300b8c46-kube-api-access-95c29\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-proxy-ca-bundles\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/053ce374-dacc-4077-a873-22ff300b8c46-serving-cert\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-client-ca\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.765448 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad06c7c-a6ab-40f3-860c-87def86419fd-serving-cert\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.866939 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/053ce374-dacc-4077-a873-22ff300b8c46-serving-cert\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-client-ca\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc 
kubenswrapper[4722]: I0219 19:22:10.867056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad06c7c-a6ab-40f3-860c-87def86419fd-serving-cert\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867188 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-config\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867268 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-config\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867308 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f2cw\" (UniqueName: \"kubernetes.io/projected/dad06c7c-a6ab-40f3-860c-87def86419fd-kube-api-access-9f2cw\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-client-ca\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: 
\"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867388 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95c29\" (UniqueName: \"kubernetes.io/projected/053ce374-dacc-4077-a873-22ff300b8c46-kube-api-access-95c29\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.867435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-proxy-ca-bundles\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.868806 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-client-ca\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.869322 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-proxy-ca-bundles\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.869471 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-client-ca\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.870134 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/053ce374-dacc-4077-a873-22ff300b8c46-config\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.870703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad06c7c-a6ab-40f3-860c-87def86419fd-config\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.874135 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad06c7c-a6ab-40f3-860c-87def86419fd-serving-cert\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.875700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/053ce374-dacc-4077-a873-22ff300b8c46-serving-cert\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.888637 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-95c29\" (UniqueName: \"kubernetes.io/projected/053ce374-dacc-4077-a873-22ff300b8c46-kube-api-access-95c29\") pod \"route-controller-manager-58d754894f-88p4k\" (UID: \"053ce374-dacc-4077-a873-22ff300b8c46\") " pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.896729 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f2cw\" (UniqueName: \"kubernetes.io/projected/dad06c7c-a6ab-40f3-860c-87def86419fd-kube-api-access-9f2cw\") pod \"controller-manager-5d9d79fcc9-lm94c\" (UID: \"dad06c7c-a6ab-40f3-860c-87def86419fd\") " pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.938423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:10 crc kubenswrapper[4722]: I0219 19:22:10.944678 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.080080 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b05f432-113b-41f2-8c75-ec167057d648" path="/var/lib/kubelet/pods/0b05f432-113b-41f2-8c75-ec167057d648/volumes" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.080768 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a22babc4-c86d-4152-8113-84595c89b271" path="/var/lib/kubelet/pods/a22babc4-c86d-4152-8113-84595c89b271/volumes" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.396676 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k"] Feb 19 19:22:11 crc kubenswrapper[4722]: W0219 19:22:11.407945 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod053ce374_dacc_4077_a873_22ff300b8c46.slice/crio-33917cd2955b126a42aad36507667ed4122414f6dc11257692795f5ffd483f06 WatchSource:0}: Error finding container 33917cd2955b126a42aad36507667ed4122414f6dc11257692795f5ffd483f06: Status 404 returned error can't find the container with id 33917cd2955b126a42aad36507667ed4122414f6dc11257692795f5ffd483f06 Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.444655 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c"] Feb 19 19:22:11 crc kubenswrapper[4722]: W0219 19:22:11.450407 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddad06c7c_a6ab_40f3_860c_87def86419fd.slice/crio-24737625b1f8656bde7e1459d0346303e4a273707988c0fb7a685d4dbc060806 WatchSource:0}: Error finding container 24737625b1f8656bde7e1459d0346303e4a273707988c0fb7a685d4dbc060806: Status 404 returned error can't find the container with 
id 24737625b1f8656bde7e1459d0346303e4a273707988c0fb7a685d4dbc060806 Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.708848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" event={"ID":"053ce374-dacc-4077-a873-22ff300b8c46","Type":"ContainerStarted","Data":"48d6446db2fb5432fc773ac6bf2aa0912885ad6a44d6c26f955f6d0843d51080"} Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.708900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" event={"ID":"053ce374-dacc-4077-a873-22ff300b8c46","Type":"ContainerStarted","Data":"33917cd2955b126a42aad36507667ed4122414f6dc11257692795f5ffd483f06"} Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.709060 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.710125 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" event={"ID":"dad06c7c-a6ab-40f3-860c-87def86419fd","Type":"ContainerStarted","Data":"b9621a4da94da727c914810f827019ed3e57eb5725fea68ed16eb893370a0736"} Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.710194 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" event={"ID":"dad06c7c-a6ab-40f3-860c-87def86419fd","Type":"ContainerStarted","Data":"24737625b1f8656bde7e1459d0346303e4a273707988c0fb7a685d4dbc060806"} Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.710382 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.715125 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.730865 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" podStartSLOduration=2.730839022 podStartE2EDuration="2.730839022s" podCreationTimestamp="2026-02-19 19:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:22:11.727246898 +0000 UTC m=+231.339597222" watchObservedRunningTime="2026-02-19 19:22:11.730839022 +0000 UTC m=+231.343189366" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.756033 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d9d79fcc9-lm94c" podStartSLOduration=2.756003859 podStartE2EDuration="2.756003859s" podCreationTimestamp="2026-02-19 19:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:22:11.750537676 +0000 UTC m=+231.362888000" watchObservedRunningTime="2026-02-19 19:22:11.756003859 +0000 UTC m=+231.368354193" Feb 19 19:22:11 crc kubenswrapper[4722]: I0219 19:22:11.894942 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-58d754894f-88p4k" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.398257 4722 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399195 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" 
containerID="cri-o://e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a" gracePeriod=15 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399295 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003" gracePeriod=15 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399348 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb" gracePeriod=15 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399391 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae" gracePeriod=15 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399571 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399597 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae" gracePeriod=15 Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.399912 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399932 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.399950 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399965 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.399980 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.399992 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.400019 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400032 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.400049 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400061 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.400082 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400094 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.400109 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400122 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400299 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400321 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400334 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400351 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400369 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.400394 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.402801 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.403261 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.408363 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.440624 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.531799 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.531847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.531877 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.531918 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.531940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.532033 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.532195 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.532311 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633463 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633500 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633551 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633604 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633595 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633676 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633722 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.633906 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.634019 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.634075 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 
19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.634139 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.734654 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.747868 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.749682 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.750512 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae" exitCode=0 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.750557 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae" exitCode=0 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.750568 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003" exitCode=0 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.750575 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb" exitCode=2 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.750635 4722 scope.go:117] "RemoveContainer" containerID="e837a6e89d9ad5ad97a2a4f4c4775ae8a409f07c04b920550363b2f001e06012" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.753820 4722 generic.go:334] "Generic (PLEG): container finished" podID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" containerID="1f68a7c9928e93107f9848c5151b976a7aa149617e7e965be09dba7a86508ed6" exitCode=0 Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.753889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d98fac92-53aa-469c-b47e-4cc6edd91ef7","Type":"ContainerDied","Data":"1f68a7c9928e93107f9848c5151b976a7aa149617e7e965be09dba7a86508ed6"} Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.755107 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:15 crc kubenswrapper[4722]: I0219 19:22:15.756349 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:15 crc kubenswrapper[4722]: W0219 19:22:15.770081 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6dcac553ee6e292f97146dd172989f7b63e53aaeb996f52139c602135aa7964a WatchSource:0}: Error finding container 
6dcac553ee6e292f97146dd172989f7b63e53aaeb996f52139c602135aa7964a: Status 404 returned error can't find the container with id 6dcac553ee6e292f97146dd172989f7b63e53aaeb996f52139c602135aa7964a Feb 19 19:22:15 crc kubenswrapper[4722]: E0219 19:22:15.773335 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895bc2e5e0bf27d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 19:22:15.772516989 +0000 UTC m=+235.384867333,LastTimestamp:2026-02-19 19:22:15.772516989 +0000 UTC m=+235.384867333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 19:22:16 crc kubenswrapper[4722]: I0219 19:22:16.763409 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 19:22:16 crc kubenswrapper[4722]: I0219 19:22:16.766930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103"} Feb 19 19:22:16 crc 
kubenswrapper[4722]: I0219 19:22:16.767012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6dcac553ee6e292f97146dd172989f7b63e53aaeb996f52139c602135aa7964a"} Feb 19 19:22:16 crc kubenswrapper[4722]: I0219 19:22:16.768320 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:16 crc kubenswrapper[4722]: I0219 19:22:16.768753 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.145183 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.146235 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.146625 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.254931 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir\") pod \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.255074 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access\") pod \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.255113 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock\") pod \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\" (UID: \"d98fac92-53aa-469c-b47e-4cc6edd91ef7\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.255182 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d98fac92-53aa-469c-b47e-4cc6edd91ef7" (UID: "d98fac92-53aa-469c-b47e-4cc6edd91ef7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.255314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock" (OuterVolumeSpecName: "var-lock") pod "d98fac92-53aa-469c-b47e-4cc6edd91ef7" (UID: "d98fac92-53aa-469c-b47e-4cc6edd91ef7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.256002 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.256047 4722 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d98fac92-53aa-469c-b47e-4cc6edd91ef7-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.261333 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d98fac92-53aa-469c-b47e-4cc6edd91ef7" (UID: "d98fac92-53aa-469c-b47e-4cc6edd91ef7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.357268 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d98fac92-53aa-469c-b47e-4cc6edd91ef7-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.771332 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.772556 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.772928 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.772993 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.773218 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.773393 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.773756 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a" exitCode=0 Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.773810 4722 scope.go:117] "RemoveContainer" containerID="0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.774890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d98fac92-53aa-469c-b47e-4cc6edd91ef7","Type":"ContainerDied","Data":"be8a061b191347417c7eff0e39c1d45a40ce52746371e25938f78f0f9a4f9e58"} Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.774912 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8a061b191347417c7eff0e39c1d45a40ce52746371e25938f78f0f9a4f9e58" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.774930 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.787940 4722 scope.go:117] "RemoveContainer" containerID="299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.789682 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.789914 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.790162 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.804699 4722 scope.go:117] "RemoveContainer" containerID="b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.817226 4722 scope.go:117] "RemoveContainer" containerID="c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.831937 4722 scope.go:117] "RemoveContainer" containerID="e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a" Feb 19 19:22:17 crc 
kubenswrapper[4722]: I0219 19:22:17.850264 4722 scope.go:117] "RemoveContainer" containerID="6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863563 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863680 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863668 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863697 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.863747 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.864185 4722 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.864207 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.864220 4722 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.864392 4722 scope.go:117] "RemoveContainer" containerID="0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.864978 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\": container with ID starting with 0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae not found: ID does not exist" containerID="0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865004 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae"} err="failed to get container status \"0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\": rpc error: code = NotFound desc = could not find container \"0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae\": container with ID starting with 0dc5e940701a579a4f776b6a383ec58271c60e2a42434bda4c0e4dfca044f8ae not found: ID does not exist" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865023 4722 scope.go:117] "RemoveContainer" containerID="299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.865429 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\": container with ID starting with 299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae not found: ID does not exist" containerID="299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865454 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae"} err="failed to get container status \"299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\": rpc error: code = NotFound desc = could not find container \"299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae\": container with ID starting with 299d1afae71b42d7cf0b6f5d2360ae7f434188f64505cb7dad7340af931f54ae not found: ID does not exist" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865467 4722 scope.go:117] "RemoveContainer" containerID="b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.865746 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\": container with ID starting with b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003 not found: ID does not exist" containerID="b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865764 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003"} err="failed to get container status \"b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\": rpc error: code = NotFound desc = could not find container \"b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003\": container with ID starting with b2350c19482f630fc5cf7d4a363d81837e3e18ac730d195f8c1e45e5fd3c8003 not found: ID does not exist" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.865778 4722 scope.go:117] "RemoveContainer" containerID="c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.866021 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\": container with ID starting with c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb not found: ID does not exist" containerID="c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.866042 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb"} err="failed to get container status \"c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\": rpc error: code = NotFound desc = could not find container 
\"c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb\": container with ID starting with c937cc08d1035b4279f2ff752ac8015d23c8f10d5063f24cc7dd39b925db09cb not found: ID does not exist" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.866056 4722 scope.go:117] "RemoveContainer" containerID="e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.866288 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\": container with ID starting with e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a not found: ID does not exist" containerID="e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.866308 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a"} err="failed to get container status \"e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\": rpc error: code = NotFound desc = could not find container \"e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a\": container with ID starting with e1d7ad502ca5735b1cf523d85038a177f171f021a49ec8a7791da4e6b05fc57a not found: ID does not exist" Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.866320 4722 scope.go:117] "RemoveContainer" containerID="6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d" Feb 19 19:22:17 crc kubenswrapper[4722]: E0219 19:22:17.866501 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\": container with ID starting with 6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d not found: ID does not exist" 
containerID="6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d"
Feb 19 19:22:17 crc kubenswrapper[4722]: I0219 19:22:17.866522 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d"} err="failed to get container status \"6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\": rpc error: code = NotFound desc = could not find container \"6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d\": container with ID starting with 6429139651a53a6cf7701834d01c37c06f3b6a46bbe7746b4c27129b8348ab3d not found: ID does not exist"
Feb 19 19:22:18 crc kubenswrapper[4722]: I0219 19:22:18.781278 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:18 crc kubenswrapper[4722]: I0219 19:22:18.796572 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:18 crc kubenswrapper[4722]: I0219 19:22:18.796875 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:18 crc kubenswrapper[4722]: I0219 19:22:18.797064 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:19 crc kubenswrapper[4722]: I0219 19:22:19.077737 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 19 19:22:19 crc kubenswrapper[4722]: E0219 19:22:19.634736 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.195:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895bc2e5e0bf27d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 19:22:15.772516989 +0000 UTC m=+235.384867333,LastTimestamp:2026-02-19 19:22:15.772516989 +0000 UTC m=+235.384867333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 19 19:22:21 crc kubenswrapper[4722]: I0219 19:22:21.079225 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:21 crc kubenswrapper[4722]: I0219 19:22:21.079438 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.935559 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.936474 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.937028 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.937725 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.938130 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:21 crc kubenswrapper[4722]: I0219 19:22:21.938172 4722 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 19 19:22:21 crc kubenswrapper[4722]: E0219 19:22:21.938459 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="200ms"
Feb 19 19:22:22 crc kubenswrapper[4722]: E0219 19:22:22.140298 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="400ms"
Feb 19 19:22:22 crc kubenswrapper[4722]: E0219 19:22:22.541537 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="800ms"
Feb 19 19:22:23 crc kubenswrapper[4722]: E0219 19:22:23.342745 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="1.6s"
Feb 19 19:22:24 crc kubenswrapper[4722]: E0219 19:22:24.944341 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.195:6443: connect: connection refused" interval="3.2s"
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.071273 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.072303 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.072654 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.093043 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da"
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.093077 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da"
Feb 19 19:22:26 crc kubenswrapper[4722]: E0219 19:22:26.093441 4722 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.093888 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:26 crc kubenswrapper[4722]: W0219 19:22:26.135690 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-8ce8ac447730ece54d241ef48379bb1d7ae043814cc3114ed05fd22610703671 WatchSource:0}: Error finding container 8ce8ac447730ece54d241ef48379bb1d7ae043814cc3114ed05fd22610703671: Status 404 returned error can't find the container with id 8ce8ac447730ece54d241ef48379bb1d7ae043814cc3114ed05fd22610703671
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.835207 4722 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="0592a4af60ae44c6f38001780aebfe314c6fffcbec293bd9f92aa6cfc99936f9" exitCode=0
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.835330 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"0592a4af60ae44c6f38001780aebfe314c6fffcbec293bd9f92aa6cfc99936f9"}
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.835939 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8ce8ac447730ece54d241ef48379bb1d7ae043814cc3114ed05fd22610703671"}
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.836362 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da"
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.836466 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da"
Feb 19 19:22:26 crc kubenswrapper[4722]: E0219 19:22:26.837124 4722 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.195:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.837573 4722 status_manager.go:851] "Failed to get status for pod" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:26 crc kubenswrapper[4722]: I0219 19:22:26.838105 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.195:6443: connect: connection refused"
Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.554645 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift" containerID="cri-o://025da636ca5ec87dbbbe0099c0cb554b53402034ea5236acbe0c2f2324b80d4e" gracePeriod=15
Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.848402 4722 generic.go:334] "Generic (PLEG): container finished" podID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerID="025da636ca5ec87dbbbe0099c0cb554b53402034ea5236acbe0c2f2324b80d4e" exitCode=0
Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.849277 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" event={"ID":"ecc880c8-beb9-4081-8af6-64d2fa857901","Type":"ContainerDied","Data":"025da636ca5ec87dbbbe0099c0cb554b53402034ea5236acbe0c2f2324b80d4e"}
Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.867078 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"216074f71ec1f1180401fdc600d91e1d4aa94547ec510e042aba30fce443a118"}
Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.867113 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1f75f0329981877025ef171683d0e8b983576e28f3b7448907e7c35bd6b37efc"}
Feb 19 19:22:27 crc kubenswrapper[4722]: I0219 19:22:27.867121 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ed6ef704a2ba61c71f9580c8186d27acaf49ec150b28d5e968224a0f9f4b14c"}
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.067737 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208454 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208530 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208589 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208720 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208859 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208902 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208935 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjd4m\" (UniqueName: \"kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.208979 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.209013 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.209084 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data\") pod \"ecc880c8-beb9-4081-8af6-64d2fa857901\" (UID: \"ecc880c8-beb9-4081-8af6-64d2fa857901\") "
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.209508 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.210475 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.210745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.211047 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.212686 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.215108 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.215565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.215999 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m" (OuterVolumeSpecName: "kube-api-access-qjd4m") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "kube-api-access-qjd4m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.216395 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.217035 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.217314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.217543 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.217768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.228641 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ecc880c8-beb9-4081-8af6-64d2fa857901" (UID: "ecc880c8-beb9-4081-8af6-64d2fa857901"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310572 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310603 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310612 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310621 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310630 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310640 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310652 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310660 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310671 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310679 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310687 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ecc880c8-beb9-4081-8af6-64d2fa857901-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310695 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310716 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ecc880c8-beb9-4081-8af6-64d2fa857901-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.310724 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjd4m\" (UniqueName: \"kubernetes.io/projected/ecc880c8-beb9-4081-8af6-64d2fa857901-kube-api-access-qjd4m\") on node \"crc\" DevicePath \"\""
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.882247 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8" event={"ID":"ecc880c8-beb9-4081-8af6-64d2fa857901","Type":"ContainerDied","Data":"e844f80f4659e52890f34ecd1020791a32cbf271dac55e2d79171097c0004545"}
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.882299 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ndzb8"
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.882310 4722 scope.go:117] "RemoveContainer" containerID="025da636ca5ec87dbbbe0099c0cb554b53402034ea5236acbe0c2f2324b80d4e"
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.884978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"08f3fed92446bcc87c06e72bb6a1f3f2ebcf3ec1329c968e778029b00a0dae40"}
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.885014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7ae6cb64355eb99531448a0ad974367a29a320a1b0434b7d243ade2b753bfd96"}
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.885343 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da"
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.885366 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da"
Feb 19 19:22:28 crc kubenswrapper[4722]: I0219 19:22:28.885567 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:30 crc kubenswrapper[4722]: I0219 19:22:30.904040 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 19 19:22:30 crc kubenswrapper[4722]: I0219 19:22:30.904440 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c" exitCode=1
Feb 19 19:22:30 crc kubenswrapper[4722]: I0219 19:22:30.904497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c"}
Feb 19 19:22:30 crc kubenswrapper[4722]: I0219 19:22:30.905382 4722 scope.go:117] "RemoveContainer" containerID="985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c"
Feb 19 19:22:30 crc kubenswrapper[4722]: I0219 19:22:30.992730 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.095298 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.095379 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.100887 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.915511 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.915950 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"93047d095dae322ea99676114c0cf5e81fab8c46f8e890c45706dc12f908b329"}
Feb 19 19:22:31 crc kubenswrapper[4722]: I0219 19:22:31.964718 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:22:33 crc kubenswrapper[4722]: I0219 19:22:33.901190 4722 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:33 crc kubenswrapper[4722]: I0219 19:22:33.931754 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da"
Feb 19 19:22:33 crc kubenswrapper[4722]: I0219 19:22:33.931812 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da"
Feb 19 19:22:33 crc kubenswrapper[4722]: I0219 19:22:33.936937 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:33 crc kubenswrapper[4722]: I0219 19:22:33.939550 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="68634038-94c9-4bb4-a05e-3d47197c3f1e"
Feb 19 19:22:34 crc kubenswrapper[4722]: I0219 19:22:34.937213 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da"
Feb 19 19:22:34 crc kubenswrapper[4722]: I0219 19:22:34.937267 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3974ea1e-a55a-4504-aec2-f9aab56fd6da"
Feb 19 19:22:40 crc kubenswrapper[4722]: I0219 19:22:40.580808 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:22:40 crc kubenswrapper[4722]: I0219 19:22:40.582806 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 19 19:22:40 crc kubenswrapper[4722]: I0219 19:22:40.582906 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 19 19:22:40 crc kubenswrapper[4722]: I0219 19:22:40.607986 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 19:22:40 crc kubenswrapper[4722]: I0219 19:22:40.969080 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.087670 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.088729 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="68634038-94c9-4bb4-a05e-3d47197c3f1e"
Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.376002 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.389871 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.679765 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 19 19:22:41 crc kubenswrapper[4722]: I0219 19:22:41.944594 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.080040 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.112340 4722 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.118772 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=27.118748037 podStartE2EDuration="27.118748037s" podCreationTimestamp="2026-02-19 19:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:22:33.746127609 +0000 UTC m=+253.358477933" watchObservedRunningTime="2026-02-19 19:22:42.118748037 +0000 UTC m=+261.731098411"
Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.121049 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-ndzb8"]
Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.121133 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.128556 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.144064 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=9.144044608 podStartE2EDuration="9.144044608s" podCreationTimestamp="2026-02-19 19:22:33 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:22:42.142945674 +0000 UTC m=+261.755296038" watchObservedRunningTime="2026-02-19 19:22:42.144044608 +0000 UTC m=+261.756394932" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.309345 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.867684 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.874647 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.906878 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 19:22:42 crc kubenswrapper[4722]: I0219 19:22:42.987605 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 19 19:22:43 crc kubenswrapper[4722]: I0219 19:22:43.080691 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" path="/var/lib/kubelet/pods/ecc880c8-beb9-4081-8af6-64d2fa857901/volumes" Feb 19 19:22:43 crc kubenswrapper[4722]: I0219 19:22:43.143888 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 19:22:43 crc kubenswrapper[4722]: I0219 19:22:43.369015 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 19:22:43 crc kubenswrapper[4722]: I0219 19:22:43.963262 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" 
Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.017753 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.166849 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.182004 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.424572 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.811863 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 19 19:22:44 crc kubenswrapper[4722]: I0219 19:22:44.912002 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.078815 4722 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.079010 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103" gracePeriod=5
Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.125764 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.132734 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.559042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.624753 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.841644 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.888401 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 19 19:22:45 crc kubenswrapper[4722]: I0219 19:22:45.941588 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364120 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"]
Feb 19 19:22:46 crc kubenswrapper[4722]: E0219 19:22:46.364361 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364375 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 19 19:22:46 crc kubenswrapper[4722]: E0219 19:22:46.364391 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364398 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift"
Feb 19 19:22:46 crc kubenswrapper[4722]: E0219 19:22:46.364407 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" containerName="installer"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364415 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" containerName="installer"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364528 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc880c8-beb9-4081-8af6-64d2fa857901" containerName="oauth-openshift"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364546 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98fac92-53aa-469c-b47e-4cc6edd91ef7" containerName="installer"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364555 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.364954 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.368368 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.369771 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.369968 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.370367 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.370851 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.372097 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.372526 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.372949 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.373221 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.374211 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.378266 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.380970 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.381652 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.387733 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.400102 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538335 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538411 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzl2b\" (UniqueName: \"kubernetes.io/projected/473612c5-4d08-4767-adb9-4bfe5d8a05f1-kube-api-access-qzl2b\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538515 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538602 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-dir\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538666 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-policies\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538787 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538840 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.538937 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-dir\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640772 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-policies\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640805 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640816 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-dir\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640854 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.640964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641008 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641048 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641083 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641143 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzl2b\" (UniqueName: \"kubernetes.io/projected/473612c5-4d08-4767-adb9-4bfe5d8a05f1-kube-api-access-qzl2b\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641201 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641241 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.641919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.642757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-audit-policies\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.644304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.645906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-service-ca\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.649236 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-login\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.649537 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-session\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.650043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.651070 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-error\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.651802 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.657424 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.657802 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-system-router-certs\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.658874 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/473612c5-4d08-4767-adb9-4bfe5d8a05f1-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.661174 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzl2b\" (UniqueName: \"kubernetes.io/projected/473612c5-4d08-4767-adb9-4bfe5d8a05f1-kube-api-access-qzl2b\") pod \"oauth-openshift-58b6dc46cc-sf28m\" (UID: \"473612c5-4d08-4767-adb9-4bfe5d8a05f1\") " pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.694971 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.709815 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.778466 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 19 19:22:46 crc kubenswrapper[4722]: I0219 19:22:46.941306 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.028003 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.042228 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.107918 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.275130 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.395612 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.421053 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.454500 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.623934 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.819999 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 19 19:22:47 crc kubenswrapper[4722]: I0219 19:22:47.924014 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.085982 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.346364 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.489794 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.629293 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.684289 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 19 19:22:48 crc kubenswrapper[4722]: I0219 19:22:48.871250 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.007432 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.354560 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.381144 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.476599 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.681001 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.768830 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.845343 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.855920 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 19:22:49 crc kubenswrapper[4722]: I0219 19:22:49.876696 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.058322 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.110609 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.179528 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 19:22:50 crc 
kubenswrapper[4722]: I0219 19:22:50.295314 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.337394 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.355257 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.390902 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.421655 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.529516 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.581253 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.581317 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.652854 4722 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.675303 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.675386 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.689470 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.762591 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.784337 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793323 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793410 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793438 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793475 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793532 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793561 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793604 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793660 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.793787 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.794013 4722 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.794028 4722 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.794039 4722 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.794052 4722 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.800745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod 
"f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.803698 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.826869 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.883278 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.887893 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.896041 4722 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.896206 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.931736 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 19 19:22:50 crc kubenswrapper[4722]: I0219 19:22:50.980284 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.029738 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 19:22:51 crc 
kubenswrapper[4722]: I0219 19:22:51.029793 4722 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103" exitCode=137 Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.029836 4722 scope.go:117] "RemoveContainer" containerID="91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.029869 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.047976 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.049053 4722 scope.go:117] "RemoveContainer" containerID="91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103" Feb 19 19:22:51 crc kubenswrapper[4722]: E0219 19:22:51.049445 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103\": container with ID starting with 91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103 not found: ID does not exist" containerID="91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.049477 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103"} err="failed to get container status \"91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103\": rpc error: code = NotFound desc = could not find container \"91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103\": container with ID starting with 
91831d59c187f4ee9b188da0d929df78dbbd8d35a2609c9c13c1e3bb7b3e2103 not found: ID does not exist" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.081458 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.081823 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.094035 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.094069 4722 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c4acea01-a073-46c4-bacf-1743a4f16e02" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.099079 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.099120 4722 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c4acea01-a073-46c4-bacf-1743a4f16e02" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.277790 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.299361 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.398023 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 19:22:51 
crc kubenswrapper[4722]: I0219 19:22:51.400745 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.424334 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.446044 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.531040 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.552862 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.560333 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.584227 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.635265 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.725851 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.788371 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.827232 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.855697 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 19:22:51 crc kubenswrapper[4722]: I0219 19:22:51.890763 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.038759 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.066208 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.167786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.228307 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.268716 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.319615 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.338979 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.363307 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 
19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.451933 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.492064 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.560144 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.572046 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.575023 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.603602 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.701017 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.732947 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.758130 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.875738 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 19:22:52 crc kubenswrapper[4722]: I0219 19:22:52.966577 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 
19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.029845 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.116391 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.194423 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.202451 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.209471 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.221431 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.332419 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.444450 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.562642 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.573524 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.579527 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.599488 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.638712 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.771204 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 19:22:53 crc kubenswrapper[4722]: I0219 19:22:53.952971 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.288380 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.376533 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.465383 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.503323 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.527113 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.611485 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 
19:22:54.622730 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.635800 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.652320 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.666870 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.742432 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.770210 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"] Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.772367 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.774026 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 19:22:54 crc kubenswrapper[4722]: I0219 19:22:54.854292 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.187853 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.210087 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 
19:22:55.221935 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.323086 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.332318 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.515311 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.527477 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.563725 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 19:22:55 crc kubenswrapper[4722]: E0219 19:22:55.576639 4722 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 19 19:22:55 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f" Netns:"/var/run/netns/1d21aa87-7cc6-424c-a263-79a9bd80ebe9" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:55 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:55 crc kubenswrapper[4722]: > Feb 19 19:22:55 crc kubenswrapper[4722]: E0219 19:22:55.576710 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 19 19:22:55 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f" Netns:"/var/run/netns/1d21aa87-7cc6-424c-a263-79a9bd80ebe9" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:55 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:55 crc kubenswrapper[4722]: > pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:55 crc kubenswrapper[4722]: E0219 19:22:55.576731 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 19 19:22:55 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f" 
Netns:"/var/run/netns/1d21aa87-7cc6-424c-a263-79a9bd80ebe9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:55 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:55 crc kubenswrapper[4722]: > pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:55 crc kubenswrapper[4722]: E0219 19:22:55.576778 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-58b6dc46cc-sf28m_openshift-authentication(473612c5-4d08-4767-adb9-4bfe5d8a05f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-58b6dc46cc-sf28m_openshift-authentication(473612c5-4d08-4767-adb9-4bfe5d8a05f1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f): error adding pod 
openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f\\\" Netns:\\\"/var/run/netns/1d21aa87-7cc6-424c-a263-79a9bd80ebe9\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=9673c0e1e6f3aee95509e1a3cc7753912aee9c96e447dd4f9c48b26619d7726f;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod \\\"oauth-openshift-58b6dc46cc-sf28m\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" podUID="473612c5-4d08-4767-adb9-4bfe5d8a05f1" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.629901 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.641324 4722 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.656486 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.832272 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.862853 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 19:22:55 crc kubenswrapper[4722]: I0219 19:22:55.899626 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.016676 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.062186 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.062970 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.109222 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.133849 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.214744 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.238682 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.249969 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.257166 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.349378 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.382933 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.679374 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.772863 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 
19:22:56.871867 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 19:22:56 crc kubenswrapper[4722]: I0219 19:22:56.935974 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.030536 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.073354 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.081606 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.185866 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.216034 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.229604 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.244484 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.298215 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.531409 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.552576 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.576209 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.588632 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.697669 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 19:22:57 crc kubenswrapper[4722]: I0219 19:22:57.864062 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.086640 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.114962 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.127495 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.161495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.184107 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" 
Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.185450 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.319023 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.348418 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.493566 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.605292 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.610287 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.641264 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.668278 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.771525 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.858472 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.901766 4722 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.907432 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 19:22:58 crc kubenswrapper[4722]: I0219 19:22:58.945760 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.083478 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.158784 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.163924 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.219578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.238692 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 19:22:59 crc kubenswrapper[4722]: E0219 19:22:59.258569 4722 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 19 19:22:59 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621" Netns:"/var/run/netns/741d5bf7-eb20-4ac0-aaaa-6d28804d51f5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:59 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:59 crc kubenswrapper[4722]: > Feb 19 19:22:59 crc kubenswrapper[4722]: E0219 19:22:59.258799 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 19 19:22:59 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621" Netns:"/var/run/netns/741d5bf7-eb20-4ac0-aaaa-6d28804d51f5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:59 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:59 crc kubenswrapper[4722]: > pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:59 crc kubenswrapper[4722]: E0219 19:22:59.258825 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 19 19:22:59 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request 
failed with status 400: 'ContainerID:"3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621" Netns:"/var/run/netns/741d5bf7-eb20-4ac0-aaaa-6d28804d51f5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod "oauth-openshift-58b6dc46cc-sf28m" not found Feb 19 19:22:59 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:22:59 crc kubenswrapper[4722]: > pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" Feb 19 19:22:59 crc kubenswrapper[4722]: E0219 19:22:59.258886 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-58b6dc46cc-sf28m_openshift-authentication(473612c5-4d08-4767-adb9-4bfe5d8a05f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-58b6dc46cc-sf28m_openshift-authentication(473612c5-4d08-4767-adb9-4bfe5d8a05f1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_oauth-openshift-58b6dc46cc-sf28m_openshift-authentication_473612c5-4d08-4767-adb9-4bfe5d8a05f1_0(3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621): error adding pod openshift-authentication_oauth-openshift-58b6dc46cc-sf28m to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621\\\" Netns:\\\"/var/run/netns/741d5bf7-eb20-4ac0-aaaa-6d28804d51f5\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-58b6dc46cc-sf28m;K8S_POD_INFRA_CONTAINER_ID=3960c8c6026b3fc2fd47e9686517f5b1efea2eda4d22b1fb487f979611b4c621;K8S_POD_UID=473612c5-4d08-4767-adb9-4bfe5d8a05f1\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m] networking: Multus: [openshift-authentication/oauth-openshift-58b6dc46cc-sf28m/473612c5-4d08-4767-adb9-4bfe5d8a05f1]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-58b6dc46cc-sf28m in out of cluster comm: pod \\\"oauth-openshift-58b6dc46cc-sf28m\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" podUID="473612c5-4d08-4767-adb9-4bfe5d8a05f1" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.373940 4722 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.527716 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.665607 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.670602 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.773448 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.822035 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 19 19:22:59 crc kubenswrapper[4722]: I0219 19:22:59.972498 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.060135 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.295844 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.457920 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.581786 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.581892 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.581968 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.583589 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"93047d095dae322ea99676114c0cf5e81fab8c46f8e890c45706dc12f908b329"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.584452 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://93047d095dae322ea99676114c0cf5e81fab8c46f8e890c45706dc12f908b329" gracePeriod=30
Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.649483 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.691758 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.767574 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.800500 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.833081 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 19 19:23:00 crc kubenswrapper[4722]: I0219 19:23:00.928144 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.075645 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.279701 4722 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.334402 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.680839 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.846858 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 19 19:23:01 crc kubenswrapper[4722]: I0219 19:23:01.944633 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.093914 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.107495 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.142464 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.310105 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.779515 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.781300 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.783494 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 19:23:02 crc kubenswrapper[4722]: I0219 19:23:02.954820 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 19 19:23:12 crc kubenswrapper[4722]: I0219 19:23:12.070364 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:23:12 crc kubenswrapper[4722]: I0219 19:23:12.071497 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:23:12 crc kubenswrapper[4722]: I0219 19:23:12.502271 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"]
Feb 19 19:23:12 crc kubenswrapper[4722]: W0219 19:23:12.510251 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod473612c5_4d08_4767_adb9_4bfe5d8a05f1.slice/crio-0e1af4ccfce161b443a1156dcfb4974738aa7ceda7368052a7694d190bfe11c4 WatchSource:0}: Error finding container 0e1af4ccfce161b443a1156dcfb4974738aa7ceda7368052a7694d190bfe11c4: Status 404 returned error can't find the container with id 0e1af4ccfce161b443a1156dcfb4974738aa7ceda7368052a7694d190bfe11c4
Feb 19 19:23:13 crc kubenswrapper[4722]: I0219 19:23:13.171327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" event={"ID":"473612c5-4d08-4767-adb9-4bfe5d8a05f1","Type":"ContainerStarted","Data":"998f033948c34dc1661b15f368dc8dbd76aaf915eba4c2403bcea03a739e915f"}
Feb 19 19:23:13 crc kubenswrapper[4722]: I0219 19:23:13.171770 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:23:13 crc kubenswrapper[4722]: I0219 19:23:13.171791 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" event={"ID":"473612c5-4d08-4767-adb9-4bfe5d8a05f1","Type":"ContainerStarted","Data":"0e1af4ccfce161b443a1156dcfb4974738aa7ceda7368052a7694d190bfe11c4"}
Feb 19 19:23:13 crc kubenswrapper[4722]: I0219 19:23:13.180722 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m"
Feb 19 19:23:13 crc kubenswrapper[4722]: I0219 19:23:13.205220 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-58b6dc46cc-sf28m" podStartSLOduration=71.205198606 podStartE2EDuration="1m11.205198606s" podCreationTimestamp="2026-02-19 19:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:23:13.197755195 +0000 UTC m=+292.810105529" watchObservedRunningTime="2026-02-19 19:23:13.205198606 +0000 UTC m=+292.817548960"
Feb 19 19:23:18 crc kubenswrapper[4722]: I0219 19:23:18.211696 4722 generic.go:334] "Generic (PLEG): container finished" podID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerID="3e07f956af5d9519f0aa46f0dd27ff59f1b20703afc1f6ad3a69b934175a5145" exitCode=0
Feb 19 19:23:18 crc kubenswrapper[4722]: I0219 19:23:18.212262 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerDied","Data":"3e07f956af5d9519f0aa46f0dd27ff59f1b20703afc1f6ad3a69b934175a5145"}
Feb 19 19:23:18 crc kubenswrapper[4722]: I0219 19:23:18.212936 4722 scope.go:117] "RemoveContainer" containerID="3e07f956af5d9519f0aa46f0dd27ff59f1b20703afc1f6ad3a69b934175a5145"
Feb 19 19:23:19 crc kubenswrapper[4722]: I0219 19:23:19.220015 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerStarted","Data":"1816806692e1e38c26b8744d0c3544e4e5966028d1e6511ad4491cc5f00ba0fc"}
Feb 19 19:23:19 crc kubenswrapper[4722]: I0219 19:23:19.220634 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr"
Feb 19 19:23:19 crc kubenswrapper[4722]: I0219 19:23:19.222959 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr"
Feb 19 19:23:20 crc kubenswrapper[4722]: I0219 19:23:20.879476 4722 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials
Feb 19 19:23:31 crc kubenswrapper[4722]: I0219 19:23:31.286796 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 19 19:23:31 crc kubenswrapper[4722]: I0219 19:23:31.289205 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 19 19:23:31 crc kubenswrapper[4722]: I0219 19:23:31.289260 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="93047d095dae322ea99676114c0cf5e81fab8c46f8e890c45706dc12f908b329" exitCode=137
Feb 19 19:23:31 crc kubenswrapper[4722]: I0219 19:23:31.289289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"93047d095dae322ea99676114c0cf5e81fab8c46f8e890c45706dc12f908b329"}
Feb 19 19:23:31 crc kubenswrapper[4722]: I0219 19:23:31.289325 4722 scope.go:117] "RemoveContainer" containerID="985d8a85273666cc55570e92e466ce9490e6da199f8fb08cb7c130d1d191686c"
Feb 19 19:23:32 crc kubenswrapper[4722]: I0219 19:23:32.296586 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 19 19:23:32 crc kubenswrapper[4722]: I0219 19:23:32.297927 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"14d170beba3bf45db7aa9b935595723d9de04df3941a7da55d0113f9e65c3d49"}
Feb 19 19:23:40 crc kubenswrapper[4722]: I0219 19:23:40.581735 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:23:40 crc kubenswrapper[4722]: I0219 19:23:40.588071 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:23:41 crc kubenswrapper[4722]: I0219 19:23:41.351640 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:23:41 crc kubenswrapper[4722]: I0219 19:23:41.355396 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 19:24:11 crc kubenswrapper[4722]: I0219 19:24:11.798874 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:24:11 crc kubenswrapper[4722]: I0219 19:24:11.799464 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.278786 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-whpmj"]
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.280494 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.297861 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-whpmj"]
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.300771 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-certificates\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.300844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-bound-sa-token\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.300892 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.300955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.301002 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-trusted-ca\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.301039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-tls\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.301076 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.301126 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nqxm\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-kube-api-access-9nqxm\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.330829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-certificates\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402245 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-bound-sa-token\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-trusted-ca\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402356 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-tls\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.402412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nqxm\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-kube-api-access-9nqxm\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.403145 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-ca-trust-extracted\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.403814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-trusted-ca\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.403940 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-certificates\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.408898 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-registry-tls\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.409791 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-installation-pull-secrets\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.421896 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nqxm\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-kube-api-access-9nqxm\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.430875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc-bound-sa-token\") pod \"image-registry-66df7c8f76-whpmj\" (UID: \"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc\") " pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.598131 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:18 crc kubenswrapper[4722]: I0219 19:24:18.811851 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-whpmj"]
Feb 19 19:24:19 crc kubenswrapper[4722]: I0219 19:24:19.574432 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" event={"ID":"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc","Type":"ContainerStarted","Data":"37defc6843c54ec20eea2e2778d80087362ecf2eb99fd8d2a18d8480bec583c0"}
Feb 19 19:24:19 crc kubenswrapper[4722]: I0219 19:24:19.574917 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:19 crc kubenswrapper[4722]: I0219 19:24:19.574941 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" event={"ID":"e22e4e21-a0d2-408a-aaa1-b0ab4156e7bc","Type":"ContainerStarted","Data":"46fd38d27d001eb42e066728c63250b9b035822d4354987401399c0d13036dad"}
Feb 19 19:24:38 crc kubenswrapper[4722]: I0219 19:24:38.611487 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj"
Feb 19 19:24:38 crc kubenswrapper[4722]: I0219 19:24:38.645351 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-whpmj" podStartSLOduration=20.645326155 podStartE2EDuration="20.645326155s" podCreationTimestamp="2026-02-19 19:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:24:19.613521893 +0000 UTC m=+359.225872257" watchObservedRunningTime="2026-02-19 19:24:38.645326155 +0000 UTC m=+378.257676509"
Feb 19 19:24:38 crc kubenswrapper[4722]: I0219 19:24:38.695590 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"]
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.206504 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"]
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.206830 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6tp9x" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="registry-server" containerID="cri-o://8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75" gracePeriod=30
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.231077 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-64frs"]
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.231694 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-64frs" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="registry-server" containerID="cri-o://d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" gracePeriod=30
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.243728 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"]
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.243986 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" containerID="cri-o://1816806692e1e38c26b8744d0c3544e4e5966028d1e6511ad4491cc5f00ba0fc" gracePeriod=30
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.260844 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqqrf"]
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.261127 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vqqrf" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="registry-server" containerID="cri-o://5ad81a5a39e1d2d4c131bcf5c486bacca24698453f66dd8aa32cd630c49e4b9c" gracePeriod=30
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.270121 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lrwfz"]
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.270960 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.285131 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"]
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.285397 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rnljk" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="registry-server" containerID="cri-o://46fb6dc449baf9d204637234c7660e38bd2e8d2f352111d61b07600262a339ee" gracePeriod=30
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.289503 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lrwfz"]
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.329567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwvqv\" (UniqueName: \"kubernetes.io/projected/6fb12d29-ac35-4e04-a25d-05b1b2545b81-kube-api-access-qwvqv\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.329651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:39 crc kubenswrapper[4722]: I0219 19:24:39.329709 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.431590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwvqv\" (UniqueName: \"kubernetes.io/projected/6fb12d29-ac35-4e04-a25d-05b1b2545b81-kube-api-access-qwvqv\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.431678 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.431728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.433085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.449846 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6fb12d29-ac35-4e04-a25d-05b1b2545b81-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.456525 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwvqv\" (UniqueName: \"kubernetes.io/projected/6fb12d29-ac35-4e04-a25d-05b1b2545b81-kube-api-access-qwvqv\") pod \"marketplace-operator-79b997595-lrwfz\" (UID: \"6fb12d29-ac35-4e04-a25d-05b1b2545b81\") " pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.597746 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.601451 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tp9x"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.633201 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content\") pod \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") "
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.633271 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities\") pod \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") "
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.633854 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvbkr\" (UniqueName: \"kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr\") pod \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\" (UID: \"396bbbdf-7f78-48e7-b02c-0737c221aaa6\") "
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.634344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities" (OuterVolumeSpecName: "utilities") pod "396bbbdf-7f78-48e7-b02c-0737c221aaa6" (UID: "396bbbdf-7f78-48e7-b02c-0737c221aaa6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.634684 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.637095 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr" (OuterVolumeSpecName: "kube-api-access-nvbkr") pod "396bbbdf-7f78-48e7-b02c-0737c221aaa6" (UID: "396bbbdf-7f78-48e7-b02c-0737c221aaa6"). InnerVolumeSpecName "kube-api-access-nvbkr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.668417 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5 is running failed: container process not found" containerID="d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" cmd=["grpc_health_probe","-addr=:50051"]
Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.668952 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5 is running failed: container process not found" containerID="d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" cmd=["grpc_health_probe","-addr=:50051"]
Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.669646 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5 is running failed: container process not found" containerID="d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" cmd=["grpc_health_probe","-addr=:50051"]
Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.669696 4722 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-64frs" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="registry-server"
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.696449 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "396bbbdf-7f78-48e7-b02c-0737c221aaa6" (UID: "396bbbdf-7f78-48e7-b02c-0737c221aaa6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.720266 4722 generic.go:334] "Generic (PLEG): container finished" podID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerID="8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75" exitCode=0
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.720351 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerDied","Data":"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75"}
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.720390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tp9x" event={"ID":"396bbbdf-7f78-48e7-b02c-0737c221aaa6","Type":"ContainerDied","Data":"8c40a4539d5d6930a5a906cb44965a1810a1f2192dbfb01db14eeaf97f5cc6ee"}
Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.720415 4722
scope.go:117] "RemoveContainer" containerID="8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.720599 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tp9x" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.730576 4722 generic.go:334] "Generic (PLEG): container finished" podID="0c9d3632-a132-4377-95ef-564cffb1f299" containerID="d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" exitCode=0 Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.730699 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerDied","Data":"d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.737823 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/396bbbdf-7f78-48e7-b02c-0737c221aaa6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.737864 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvbkr\" (UniqueName: \"kubernetes.io/projected/396bbbdf-7f78-48e7-b02c-0737c221aaa6-kube-api-access-nvbkr\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.740784 4722 generic.go:334] "Generic (PLEG): container finished" podID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerID="5ad81a5a39e1d2d4c131bcf5c486bacca24698453f66dd8aa32cd630c49e4b9c" exitCode=0 Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.740848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" 
event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerDied","Data":"5ad81a5a39e1d2d4c131bcf5c486bacca24698453f66dd8aa32cd630c49e4b9c"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.744994 4722 generic.go:334] "Generic (PLEG): container finished" podID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerID="46fb6dc449baf9d204637234c7660e38bd2e8d2f352111d61b07600262a339ee" exitCode=0 Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.745093 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerDied","Data":"46fb6dc449baf9d204637234c7660e38bd2e8d2f352111d61b07600262a339ee"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.752690 4722 generic.go:334] "Generic (PLEG): container finished" podID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerID="1816806692e1e38c26b8744d0c3544e4e5966028d1e6511ad4491cc5f00ba0fc" exitCode=0 Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.752732 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerDied","Data":"1816806692e1e38c26b8744d0c3544e4e5966028d1e6511ad4491cc5f00ba0fc"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.757092 4722 scope.go:117] "RemoveContainer" containerID="df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.757777 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.763635 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6tp9x"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.785037 4722 scope.go:117] "RemoveContainer" 
containerID="4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.800305 4722 scope.go:117] "RemoveContainer" containerID="8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75" Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.800797 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75\": container with ID starting with 8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75 not found: ID does not exist" containerID="8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.800827 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75"} err="failed to get container status \"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75\": rpc error: code = NotFound desc = could not find container \"8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75\": container with ID starting with 8b93309437c20cfb2028664d24c8f0d05dfa553d9e1ed62b4058e22a3437ae75 not found: ID does not exist" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.800852 4722 scope.go:117] "RemoveContainer" containerID="df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081" Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.801137 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081\": container with ID starting with df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081 not found: ID does not exist" containerID="df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081" Feb 19 19:24:40 crc 
kubenswrapper[4722]: I0219 19:24:39.801174 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081"} err="failed to get container status \"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081\": rpc error: code = NotFound desc = could not find container \"df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081\": container with ID starting with df018e3e77845c4a14dcdc8c18e267d832cc8ca07d477af33aeda00aaf541081 not found: ID does not exist" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.801271 4722 scope.go:117] "RemoveContainer" containerID="4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6" Feb 19 19:24:40 crc kubenswrapper[4722]: E0219 19:24:39.801702 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6\": container with ID starting with 4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6 not found: ID does not exist" containerID="4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.801728 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6"} err="failed to get container status \"4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6\": rpc error: code = NotFound desc = could not find container \"4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6\": container with ID starting with 4dfab637a30a07b46d85902a934c0e59624523e2aba07d70b027c90fc057cea6 not found: ID does not exist" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:39.801745 4722 scope.go:117] "RemoveContainer" containerID="3e07f956af5d9519f0aa46f0dd27ff59f1b20703afc1f6ad3a69b934175a5145" Feb 19 
19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.313143 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.328622 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.347840 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities\") pod \"0c9d3632-a132-4377-95ef-564cffb1f299\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.347964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content\") pod \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348117 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9szc5\" (UniqueName: \"kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5\") pod \"0c9d3632-a132-4377-95ef-564cffb1f299\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348222 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content\") pod \"0c9d3632-a132-4377-95ef-564cffb1f299\" (UID: \"0c9d3632-a132-4377-95ef-564cffb1f299\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348293 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz4g5\" (UniqueName: 
\"kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5\") pod \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348432 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities\") pod \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\" (UID: \"f10dae1c-d938-4cce-893b-4ad7eca7d23f\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348462 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities" (OuterVolumeSpecName: "utilities") pod "0c9d3632-a132-4377-95ef-564cffb1f299" (UID: "0c9d3632-a132-4377-95ef-564cffb1f299"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.348936 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.349528 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities" (OuterVolumeSpecName: "utilities") pod "f10dae1c-d938-4cce-893b-4ad7eca7d23f" (UID: "f10dae1c-d938-4cce-893b-4ad7eca7d23f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.358200 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5" (OuterVolumeSpecName: "kube-api-access-9szc5") pod "0c9d3632-a132-4377-95ef-564cffb1f299" (UID: "0c9d3632-a132-4377-95ef-564cffb1f299"). InnerVolumeSpecName "kube-api-access-9szc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.358632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5" (OuterVolumeSpecName: "kube-api-access-fz4g5") pod "f10dae1c-d938-4cce-893b-4ad7eca7d23f" (UID: "f10dae1c-d938-4cce-893b-4ad7eca7d23f"). InnerVolumeSpecName "kube-api-access-fz4g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.373794 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f10dae1c-d938-4cce-893b-4ad7eca7d23f" (UID: "f10dae1c-d938-4cce-893b-4ad7eca7d23f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.405091 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c9d3632-a132-4377-95ef-564cffb1f299" (UID: "0c9d3632-a132-4377-95ef-564cffb1f299"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.408132 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.409727 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449360 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities\") pod \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449438 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content\") pod \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449506 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8z64\" (UniqueName: \"kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64\") pod \"cb6886b7-9193-4c89-96c8-64b61c3251a4\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449562 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") pod \"cb6886b7-9193-4c89-96c8-64b61c3251a4\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449581 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbw2p\" (UniqueName: \"kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p\") 
pod \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\" (UID: \"2bb14baa-8bfc-415a-aa95-50b79f3c75ea\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449606 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") pod \"cb6886b7-9193-4c89-96c8-64b61c3251a4\" (UID: \"cb6886b7-9193-4c89-96c8-64b61c3251a4\") " Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449819 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c9d3632-a132-4377-95ef-564cffb1f299-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449835 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz4g5\" (UniqueName: \"kubernetes.io/projected/f10dae1c-d938-4cce-893b-4ad7eca7d23f-kube-api-access-fz4g5\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449847 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449857 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f10dae1c-d938-4cce-893b-4ad7eca7d23f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.449891 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9szc5\" (UniqueName: \"kubernetes.io/projected/0c9d3632-a132-4377-95ef-564cffb1f299-kube-api-access-9szc5\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.450114 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities" (OuterVolumeSpecName: "utilities") pod "2bb14baa-8bfc-415a-aa95-50b79f3c75ea" (UID: "2bb14baa-8bfc-415a-aa95-50b79f3c75ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.450404 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cb6886b7-9193-4c89-96c8-64b61c3251a4" (UID: "cb6886b7-9193-4c89-96c8-64b61c3251a4"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.453006 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p" (OuterVolumeSpecName: "kube-api-access-sbw2p") pod "2bb14baa-8bfc-415a-aa95-50b79f3c75ea" (UID: "2bb14baa-8bfc-415a-aa95-50b79f3c75ea"). InnerVolumeSpecName "kube-api-access-sbw2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.453299 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64" (OuterVolumeSpecName: "kube-api-access-z8z64") pod "cb6886b7-9193-4c89-96c8-64b61c3251a4" (UID: "cb6886b7-9193-4c89-96c8-64b61c3251a4"). InnerVolumeSpecName "kube-api-access-z8z64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.454693 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cb6886b7-9193-4c89-96c8-64b61c3251a4" (UID: "cb6886b7-9193-4c89-96c8-64b61c3251a4"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.490582 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-lrwfz"] Feb 19 19:24:40 crc kubenswrapper[4722]: W0219 19:24:40.492309 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fb12d29_ac35_4e04_a25d_05b1b2545b81.slice/crio-5524dd0fe1ca69174883cbf1ca3edafe1177f8b83545612453c0dc82cc0d2b50 WatchSource:0}: Error finding container 5524dd0fe1ca69174883cbf1ca3edafe1177f8b83545612453c0dc82cc0d2b50: Status 404 returned error can't find the container with id 5524dd0fe1ca69174883cbf1ca3edafe1177f8b83545612453c0dc82cc0d2b50 Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.550728 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbw2p\" (UniqueName: \"kubernetes.io/projected/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-kube-api-access-sbw2p\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.550756 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.550768 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.550777 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8z64\" (UniqueName: \"kubernetes.io/projected/cb6886b7-9193-4c89-96c8-64b61c3251a4-kube-api-access-z8z64\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.550786 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb6886b7-9193-4c89-96c8-64b61c3251a4-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.602829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bb14baa-8bfc-415a-aa95-50b79f3c75ea" (UID: "2bb14baa-8bfc-415a-aa95-50b79f3c75ea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.651816 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bb14baa-8bfc-415a-aa95-50b79f3c75ea-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.759476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" event={"ID":"6fb12d29-ac35-4e04-a25d-05b1b2545b81","Type":"ContainerStarted","Data":"29bdb6089b8cc7071e4915a0ffa320ee1fb628b638ea3c2c0929496c59394785"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.759525 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" event={"ID":"6fb12d29-ac35-4e04-a25d-05b1b2545b81","Type":"ContainerStarted","Data":"5524dd0fe1ca69174883cbf1ca3edafe1177f8b83545612453c0dc82cc0d2b50"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.759952 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.760574 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-lrwfz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body= Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.760614 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" podUID="6fb12d29-ac35-4e04-a25d-05b1b2545b81" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 
19:24:40.761762 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-64frs" event={"ID":"0c9d3632-a132-4377-95ef-564cffb1f299","Type":"ContainerDied","Data":"d33d192f020b6508198a4a19887938ad42d94be353afef74a8413b4aa30e91d1"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.761805 4722 scope.go:117] "RemoveContainer" containerID="d03cccfd1a082ecc54f59999686b5bddaa098094f358428bde6ef8d24f4826d5" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.761878 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-64frs" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.763649 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vqqrf" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.763658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vqqrf" event={"ID":"f10dae1c-d938-4cce-893b-4ad7eca7d23f","Type":"ContainerDied","Data":"104233a8c5f814fc84e4081cc01af39a90044fcd055492fd733214b7e3b634d4"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.769639 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnljk" event={"ID":"2bb14baa-8bfc-415a-aa95-50b79f3c75ea","Type":"ContainerDied","Data":"1c1bf847d9c8bd6cdac4a8d78654087bcd70cd49df2904b71c207590aa5bdd28"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.769689 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnljk" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.773391 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" event={"ID":"cb6886b7-9193-4c89-96c8-64b61c3251a4","Type":"ContainerDied","Data":"d0c096f9abea14bd89e01cd5df78cfd43109b66f0678b624949e1ec87cdc1cd4"} Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.773459 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4gbkr" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.782951 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" podStartSLOduration=1.782933011 podStartE2EDuration="1.782933011s" podCreationTimestamp="2026-02-19 19:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:24:40.781524627 +0000 UTC m=+380.393874981" watchObservedRunningTime="2026-02-19 19:24:40.782933011 +0000 UTC m=+380.395283325" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.796132 4722 scope.go:117] "RemoveContainer" containerID="ecdd2f0fffaf519cc5830b6edc00c3c6f8ed2646ef4460850d3ebbfc25bad88c" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.822592 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-64frs"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.825271 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-64frs"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.829007 4722 scope.go:117] "RemoveContainer" containerID="83c9ec76be9f3502d89c676d78e714eeea9b0340976175aeadfd0dc3726f4500" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.832948 4722 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqqrf"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.839489 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vqqrf"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.852012 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.856191 4722 scope.go:117] "RemoveContainer" containerID="5ad81a5a39e1d2d4c131bcf5c486bacca24698453f66dd8aa32cd630c49e4b9c" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.859987 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rnljk"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.864065 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.867654 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4gbkr"] Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.874537 4722 scope.go:117] "RemoveContainer" containerID="fed968269de56954a9bf853304185d7d7e89b05c7032995e1f8430c840f32748" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.889230 4722 scope.go:117] "RemoveContainer" containerID="b5c97b5b76e7afa24f8f93363368d20e4563b18ad7e8eaf0a0672fe76a243f0a" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.915800 4722 scope.go:117] "RemoveContainer" containerID="46fb6dc449baf9d204637234c7660e38bd2e8d2f352111d61b07600262a339ee" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.932021 4722 scope.go:117] "RemoveContainer" containerID="a2f518a60109d1ac4178243c5d97f899b29c7b0af31605dc637805b2a245c236" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.958235 4722 scope.go:117] "RemoveContainer" 
containerID="78d9b73635fb9fd918479e49197028103f67da7ed33002bbffe05da3a4ec4523" Feb 19 19:24:40 crc kubenswrapper[4722]: I0219 19:24:40.970911 4722 scope.go:117] "RemoveContainer" containerID="1816806692e1e38c26b8744d0c3544e4e5966028d1e6511ad4491cc5f00ba0fc" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.077514 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" path="/var/lib/kubelet/pods/0c9d3632-a132-4377-95ef-564cffb1f299/volumes" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.079428 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" path="/var/lib/kubelet/pods/2bb14baa-8bfc-415a-aa95-50b79f3c75ea/volumes" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.080190 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" path="/var/lib/kubelet/pods/396bbbdf-7f78-48e7-b02c-0737c221aaa6/volumes" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.082824 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" path="/var/lib/kubelet/pods/cb6886b7-9193-4c89-96c8-64b61c3251a4/volumes" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.083380 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" path="/var/lib/kubelet/pods/f10dae1c-d938-4cce-893b-4ad7eca7d23f/volumes" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.419466 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vwrjw"] Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.419847 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="extract-utilities" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.419915 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="extract-utilities" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.419967 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420019 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420066 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="extract-content" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420110 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="extract-content" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420172 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="extract-content" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420228 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="extract-content" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420277 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="extract-utilities" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420320 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="extract-utilities" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420369 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="extract-utilities" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420414 4722 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="extract-utilities" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420463 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420511 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420559 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="extract-content" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420602 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="extract-content" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420651 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420695 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.420769 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.420936 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.421008 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="extract-utilities" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421091 4722 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="extract-utilities" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.421178 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421230 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.421340 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="extract-content" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421391 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="extract-content" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421588 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421664 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9d3632-a132-4377-95ef-564cffb1f299" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421734 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="396bbbdf-7f78-48e7-b02c-0737c221aaa6" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421783 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb14baa-8bfc-415a-aa95-50b79f3c75ea" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421849 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10dae1c-d938-4cce-893b-4ad7eca7d23f" containerName="registry-server" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.421895 4722 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" Feb 19 19:24:41 crc kubenswrapper[4722]: E0219 19:24:41.422032 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.422095 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6886b7-9193-4c89-96c8-64b61c3251a4" containerName="marketplace-operator" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.423519 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.425570 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.430783 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwrjw"] Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.460972 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7vhl\" (UniqueName: \"kubernetes.io/projected/7a6ec43d-cefe-40ee-b41e-81dc96b88739-kube-api-access-z7vhl\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.461036 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-utilities\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.461069 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-catalog-content\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.562274 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7vhl\" (UniqueName: \"kubernetes.io/projected/7a6ec43d-cefe-40ee-b41e-81dc96b88739-kube-api-access-z7vhl\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.562333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-utilities\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.562363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-catalog-content\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.562978 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-catalog-content\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.563104 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a6ec43d-cefe-40ee-b41e-81dc96b88739-utilities\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.581012 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7vhl\" (UniqueName: \"kubernetes.io/projected/7a6ec43d-cefe-40ee-b41e-81dc96b88739-kube-api-access-z7vhl\") pod \"certified-operators-vwrjw\" (UID: \"7a6ec43d-cefe-40ee-b41e-81dc96b88739\") " pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.620421 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n2l4s"] Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.621655 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.623931 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.634202 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2l4s"] Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.663539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-catalog-content\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.663802 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-25rn8\" (UniqueName: \"kubernetes.io/projected/19cd1ff4-6442-47bc-8c68-679c1c19abce-kube-api-access-25rn8\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.663886 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-utilities\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.740796 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vwrjw" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.764866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-catalog-content\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.764920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25rn8\" (UniqueName: \"kubernetes.io/projected/19cd1ff4-6442-47bc-8c68-679c1c19abce-kube-api-access-25rn8\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.764940 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-utilities\") pod \"community-operators-n2l4s\" (UID: 
\"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.766412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-catalog-content\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.766495 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19cd1ff4-6442-47bc-8c68-679c1c19abce-utilities\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.783335 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25rn8\" (UniqueName: \"kubernetes.io/projected/19cd1ff4-6442-47bc-8c68-679c1c19abce-kube-api-access-25rn8\") pod \"community-operators-n2l4s\" (UID: \"19cd1ff4-6442-47bc-8c68-679c1c19abce\") " pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.788006 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-lrwfz" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.798514 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.798832 4722 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:24:41 crc kubenswrapper[4722]: I0219 19:24:41.937938 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n2l4s" Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.084668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n2l4s"] Feb 19 19:24:42 crc kubenswrapper[4722]: W0219 19:24:42.088091 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19cd1ff4_6442_47bc_8c68_679c1c19abce.slice/crio-cbd85b1b2f7e0f83d4e6aa8487d060900280e5296530b4b631e3d9d641fd7cb3 WatchSource:0}: Error finding container cbd85b1b2f7e0f83d4e6aa8487d060900280e5296530b4b631e3d9d641fd7cb3: Status 404 returned error can't find the container with id cbd85b1b2f7e0f83d4e6aa8487d060900280e5296530b4b631e3d9d641fd7cb3 Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.153000 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vwrjw"] Feb 19 19:24:42 crc kubenswrapper[4722]: W0219 19:24:42.155459 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a6ec43d_cefe_40ee_b41e_81dc96b88739.slice/crio-5a14d70010b57d838a6fa62e420fcca1eb35b4723a013c2886c1b4f93dc794c5 WatchSource:0}: Error finding container 5a14d70010b57d838a6fa62e420fcca1eb35b4723a013c2886c1b4f93dc794c5: Status 404 returned error can't find the container with id 5a14d70010b57d838a6fa62e420fcca1eb35b4723a013c2886c1b4f93dc794c5 Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.791888 4722 generic.go:334] "Generic (PLEG): 
container finished" podID="19cd1ff4-6442-47bc-8c68-679c1c19abce" containerID="6ec4885ad6409af62960f29e754821fb63e4679cc0c42272bb54d2c421104548" exitCode=0 Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.791938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2l4s" event={"ID":"19cd1ff4-6442-47bc-8c68-679c1c19abce","Type":"ContainerDied","Data":"6ec4885ad6409af62960f29e754821fb63e4679cc0c42272bb54d2c421104548"} Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.791986 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2l4s" event={"ID":"19cd1ff4-6442-47bc-8c68-679c1c19abce","Type":"ContainerStarted","Data":"cbd85b1b2f7e0f83d4e6aa8487d060900280e5296530b4b631e3d9d641fd7cb3"} Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.795709 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a6ec43d-cefe-40ee-b41e-81dc96b88739" containerID="427e0d3de2b7d9f99099ae26aec2524de06631a7e6bc9daaf68410c29a5ab986" exitCode=0 Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.795795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwrjw" event={"ID":"7a6ec43d-cefe-40ee-b41e-81dc96b88739","Type":"ContainerDied","Data":"427e0d3de2b7d9f99099ae26aec2524de06631a7e6bc9daaf68410c29a5ab986"} Feb 19 19:24:42 crc kubenswrapper[4722]: I0219 19:24:42.795873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwrjw" event={"ID":"7a6ec43d-cefe-40ee-b41e-81dc96b88739","Type":"ContainerStarted","Data":"5a14d70010b57d838a6fa62e420fcca1eb35b4723a013c2886c1b4f93dc794c5"} Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.802207 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a6ec43d-cefe-40ee-b41e-81dc96b88739" containerID="d2613d4ca6b645c54940dcb203608277edb114bdea813d90e87d749173b71b1f" exitCode=0 Feb 19 19:24:43 crc kubenswrapper[4722]: 
I0219 19:24:43.802292 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwrjw" event={"ID":"7a6ec43d-cefe-40ee-b41e-81dc96b88739","Type":"ContainerDied","Data":"d2613d4ca6b645c54940dcb203608277edb114bdea813d90e87d749173b71b1f"} Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.806913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2l4s" event={"ID":"19cd1ff4-6442-47bc-8c68-679c1c19abce","Type":"ContainerStarted","Data":"e9dc8c766c83abf6467a10db880aa2e1401ae3aaf2dbf53939376ab83fb22261"} Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.814117 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpzr"] Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.815363 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.816979 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.824398 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpzr"] Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.995021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-utilities\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.995101 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d9x8\" (UniqueName: 
\"kubernetes.io/projected/277ec436-8032-4711-8573-5b2eaab8f212-kube-api-access-2d9x8\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:43 crc kubenswrapper[4722]: I0219 19:24:43.995246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-catalog-content\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.020010 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tr77s"] Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.021206 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.023117 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.032375 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tr77s"] Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.096874 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-catalog-content\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.096928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-utilities\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.096962 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw87j\" (UniqueName: \"kubernetes.io/projected/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-kube-api-access-zw87j\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.096983 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d9x8\" (UniqueName: \"kubernetes.io/projected/277ec436-8032-4711-8573-5b2eaab8f212-kube-api-access-2d9x8\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.097010 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-catalog-content\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.097120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-utilities\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.097425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-catalog-content\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.097735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/277ec436-8032-4711-8573-5b2eaab8f212-utilities\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.116220 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d9x8\" (UniqueName: \"kubernetes.io/projected/277ec436-8032-4711-8573-5b2eaab8f212-kube-api-access-2d9x8\") pod \"redhat-marketplace-xhpzr\" (UID: \"277ec436-8032-4711-8573-5b2eaab8f212\") " pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.141065 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xhpzr" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.197825 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw87j\" (UniqueName: \"kubernetes.io/projected/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-kube-api-access-zw87j\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.197898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-utilities\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.197935 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-catalog-content\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.198385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-catalog-content\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s" Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.198587 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-utilities\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " 
pod="openshift-marketplace/redhat-operators-tr77s"
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.222936 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw87j\" (UniqueName: \"kubernetes.io/projected/a704e2d3-bed1-47a6-a2d1-af2c3583e06c-kube-api-access-zw87j\") pod \"redhat-operators-tr77s\" (UID: \"a704e2d3-bed1-47a6-a2d1-af2c3583e06c\") " pod="openshift-marketplace/redhat-operators-tr77s"
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.329703 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xhpzr"]
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.369866 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tr77s"
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.555969 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tr77s"]
Feb 19 19:24:44 crc kubenswrapper[4722]: W0219 19:24:44.577679 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda704e2d3_bed1_47a6_a2d1_af2c3583e06c.slice/crio-61b9a2a5fea71e7d4ebfb925df25f41b7746c58e230db39e7b51459deb47946c WatchSource:0}: Error finding container 61b9a2a5fea71e7d4ebfb925df25f41b7746c58e230db39e7b51459deb47946c: Status 404 returned error can't find the container with id 61b9a2a5fea71e7d4ebfb925df25f41b7746c58e230db39e7b51459deb47946c
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.813814 4722 generic.go:334] "Generic (PLEG): container finished" podID="277ec436-8032-4711-8573-5b2eaab8f212" containerID="8c0260ca0f7f0558a17eda8de6ad6a4bd513d7b35c8a2898c91107ecdd18a35a" exitCode=0
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.813928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpzr" event={"ID":"277ec436-8032-4711-8573-5b2eaab8f212","Type":"ContainerDied","Data":"8c0260ca0f7f0558a17eda8de6ad6a4bd513d7b35c8a2898c91107ecdd18a35a"}
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.813968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpzr" event={"ID":"277ec436-8032-4711-8573-5b2eaab8f212","Type":"ContainerStarted","Data":"47f928d595f73ddfa8a4368ef0aa8ba0590b597c5f910a8dd66bc9eae2da0931"}
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.816956 4722 generic.go:334] "Generic (PLEG): container finished" podID="19cd1ff4-6442-47bc-8c68-679c1c19abce" containerID="e9dc8c766c83abf6467a10db880aa2e1401ae3aaf2dbf53939376ab83fb22261" exitCode=0
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.817009 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2l4s" event={"ID":"19cd1ff4-6442-47bc-8c68-679c1c19abce","Type":"ContainerDied","Data":"e9dc8c766c83abf6467a10db880aa2e1401ae3aaf2dbf53939376ab83fb22261"}
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.819688 4722 generic.go:334] "Generic (PLEG): container finished" podID="a704e2d3-bed1-47a6-a2d1-af2c3583e06c" containerID="bf6d642b3d56af7be565e8c58abeae3b7dcd307a2e1180c940896739bdc1908c" exitCode=0
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.819752 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr77s" event={"ID":"a704e2d3-bed1-47a6-a2d1-af2c3583e06c","Type":"ContainerDied","Data":"bf6d642b3d56af7be565e8c58abeae3b7dcd307a2e1180c940896739bdc1908c"}
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.819773 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr77s" event={"ID":"a704e2d3-bed1-47a6-a2d1-af2c3583e06c","Type":"ContainerStarted","Data":"61b9a2a5fea71e7d4ebfb925df25f41b7746c58e230db39e7b51459deb47946c"}
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.823538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vwrjw" event={"ID":"7a6ec43d-cefe-40ee-b41e-81dc96b88739","Type":"ContainerStarted","Data":"ebf0ed52b31c6e468bcd90184c245b728e44bc4262736fa3b1379a35674a2720"}
Feb 19 19:24:44 crc kubenswrapper[4722]: I0219 19:24:44.884213 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vwrjw" podStartSLOduration=2.439384419 podStartE2EDuration="3.884187748s" podCreationTimestamp="2026-02-19 19:24:41 +0000 UTC" firstStartedPulling="2026-02-19 19:24:42.797976337 +0000 UTC m=+382.410326661" lastFinishedPulling="2026-02-19 19:24:44.242779666 +0000 UTC m=+383.855129990" observedRunningTime="2026-02-19 19:24:44.883114634 +0000 UTC m=+384.495464958" watchObservedRunningTime="2026-02-19 19:24:44.884187748 +0000 UTC m=+384.496538082"
Feb 19 19:24:45 crc kubenswrapper[4722]: I0219 19:24:45.829415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n2l4s" event={"ID":"19cd1ff4-6442-47bc-8c68-679c1c19abce","Type":"ContainerStarted","Data":"63efecbfdbce9ed69369473ff3c9ea7e712f6279d7c1b8296d2b5455bf799f93"}
Feb 19 19:24:45 crc kubenswrapper[4722]: I0219 19:24:45.831195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr77s" event={"ID":"a704e2d3-bed1-47a6-a2d1-af2c3583e06c","Type":"ContainerStarted","Data":"2cbef89681851c35b3099021254ae070e04a5a0e215e89780d4ea53d5f2c05be"}
Feb 19 19:24:45 crc kubenswrapper[4722]: I0219 19:24:45.832656 4722 generic.go:334] "Generic (PLEG): container finished" podID="277ec436-8032-4711-8573-5b2eaab8f212" containerID="0cbf7ecd5e0f5b3dbc145a34bb9d664402290a42de9d6983be300359ba8d4274" exitCode=0
Feb 19 19:24:45 crc kubenswrapper[4722]: I0219 19:24:45.832721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpzr" event={"ID":"277ec436-8032-4711-8573-5b2eaab8f212","Type":"ContainerDied","Data":"0cbf7ecd5e0f5b3dbc145a34bb9d664402290a42de9d6983be300359ba8d4274"}
Feb 19 19:24:45 crc kubenswrapper[4722]: I0219 19:24:45.847410 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n2l4s" podStartSLOduration=2.325751633 podStartE2EDuration="4.847391414s" podCreationTimestamp="2026-02-19 19:24:41 +0000 UTC" firstStartedPulling="2026-02-19 19:24:42.793426726 +0000 UTC m=+382.405777050" lastFinishedPulling="2026-02-19 19:24:45.315066507 +0000 UTC m=+384.927416831" observedRunningTime="2026-02-19 19:24:45.84403018 +0000 UTC m=+385.456380504" watchObservedRunningTime="2026-02-19 19:24:45.847391414 +0000 UTC m=+385.459741738"
Feb 19 19:24:46 crc kubenswrapper[4722]: I0219 19:24:46.846727 4722 generic.go:334] "Generic (PLEG): container finished" podID="a704e2d3-bed1-47a6-a2d1-af2c3583e06c" containerID="2cbef89681851c35b3099021254ae070e04a5a0e215e89780d4ea53d5f2c05be" exitCode=0
Feb 19 19:24:46 crc kubenswrapper[4722]: I0219 19:24:46.846827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr77s" event={"ID":"a704e2d3-bed1-47a6-a2d1-af2c3583e06c","Type":"ContainerDied","Data":"2cbef89681851c35b3099021254ae070e04a5a0e215e89780d4ea53d5f2c05be"}
Feb 19 19:24:46 crc kubenswrapper[4722]: I0219 19:24:46.859654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xhpzr" event={"ID":"277ec436-8032-4711-8573-5b2eaab8f212","Type":"ContainerStarted","Data":"868af1b6814a812654e403eafe22dece8d3939db37b70dad98e25cd53360d0ca"}
Feb 19 19:24:46 crc kubenswrapper[4722]: I0219 19:24:46.883822 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xhpzr" podStartSLOduration=2.421865899 podStartE2EDuration="3.883801347s" podCreationTimestamp="2026-02-19 19:24:43 +0000 UTC" firstStartedPulling="2026-02-19 19:24:44.815368535 +0000 UTC m=+384.427718859" lastFinishedPulling="2026-02-19 19:24:46.277303993 +0000 UTC m=+385.889654307" observedRunningTime="2026-02-19 19:24:46.880532187 +0000 UTC m=+386.492882511" watchObservedRunningTime="2026-02-19 19:24:46.883801347 +0000 UTC m=+386.496151681"
Feb 19 19:24:47 crc kubenswrapper[4722]: I0219 19:24:47.866786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tr77s" event={"ID":"a704e2d3-bed1-47a6-a2d1-af2c3583e06c","Type":"ContainerStarted","Data":"2cad19f1c16d165313021fd5d803db4280cb827de7bc5fe079236904879e4326"}
Feb 19 19:24:47 crc kubenswrapper[4722]: I0219 19:24:47.889424 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tr77s" podStartSLOduration=1.4606881170000001 podStartE2EDuration="3.889404741s" podCreationTimestamp="2026-02-19 19:24:44 +0000 UTC" firstStartedPulling="2026-02-19 19:24:44.822121303 +0000 UTC m=+384.434471647" lastFinishedPulling="2026-02-19 19:24:47.250837947 +0000 UTC m=+386.863188271" observedRunningTime="2026-02-19 19:24:47.88676159 +0000 UTC m=+387.499111924" watchObservedRunningTime="2026-02-19 19:24:47.889404741 +0000 UTC m=+387.501755065"
Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.740931 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.741653 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.790605 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.922690 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vwrjw"
Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.938680 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.938746 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:51 crc kubenswrapper[4722]: I0219 19:24:51.981915 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:52 crc kubenswrapper[4722]: I0219 19:24:52.923451 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n2l4s"
Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.142197 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xhpzr"
Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.142723 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xhpzr"
Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.198746 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xhpzr"
Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.370202 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tr77s"
Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.370262 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tr77s"
Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.416460 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tr77s"
Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.944245 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xhpzr"
Feb 19 19:24:54 crc kubenswrapper[4722]: I0219 19:24:54.946856 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tr77s"
Feb 19 19:25:03 crc kubenswrapper[4722]: I0219 19:25:03.753764 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" podUID="8d31d88d-2e34-4b55-b843-b8a67b957680" containerName="registry" containerID="cri-o://b11f548bfd17279778f42b1ce10841b0e20ec850d16175454f0810e6fc866fd8" gracePeriod=30
Feb 19 19:25:03 crc kubenswrapper[4722]: I0219 19:25:03.948049 4722 generic.go:334] "Generic (PLEG): container finished" podID="8d31d88d-2e34-4b55-b843-b8a67b957680" containerID="b11f548bfd17279778f42b1ce10841b0e20ec850d16175454f0810e6fc866fd8" exitCode=0
Feb 19 19:25:03 crc kubenswrapper[4722]: I0219 19:25:03.948099 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" event={"ID":"8d31d88d-2e34-4b55-b843-b8a67b957680","Type":"ContainerDied","Data":"b11f548bfd17279778f42b1ce10841b0e20ec850d16175454f0810e6fc866fd8"}
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.150606 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173252 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") "
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173303 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") "
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173324 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") "
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173363 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2csz\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") "
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173413 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") "
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173459 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") "
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173483 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") "
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.173541 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted\") pod \"8d31d88d-2e34-4b55-b843-b8a67b957680\" (UID: \"8d31d88d-2e34-4b55-b843-b8a67b957680\") "
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.175524 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.175536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.187566 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz" (OuterVolumeSpecName: "kube-api-access-d2csz") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "kube-api-access-d2csz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.189357 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.192648 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.199284 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.199671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.200487 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8d31d88d-2e34-4b55-b843-b8a67b957680" (UID: "8d31d88d-2e34-4b55-b843-b8a67b957680"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274341 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274382 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d31d88d-2e34-4b55-b843-b8a67b957680-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274394 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2csz\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-kube-api-access-d2csz\") on node \"crc\" DevicePath \"\""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274404 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274415 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d31d88d-2e34-4b55-b843-b8a67b957680-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274422 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d31d88d-2e34-4b55-b843-b8a67b957680-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.274430 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d31d88d-2e34-4b55-b843-b8a67b957680-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.956250 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq" event={"ID":"8d31d88d-2e34-4b55-b843-b8a67b957680","Type":"ContainerDied","Data":"7fc589c7d609f9f8ea97795796aadbb293f365ba97d8b385ba4c6ea2f33eb413"}
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.956334 4722 scope.go:117] "RemoveContainer" containerID="b11f548bfd17279778f42b1ce10841b0e20ec850d16175454f0810e6fc866fd8"
Feb 19 19:25:04 crc kubenswrapper[4722]: I0219 19:25:04.956332 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k6bqq"
Feb 19 19:25:05 crc kubenswrapper[4722]: I0219 19:25:05.013012 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"]
Feb 19 19:25:05 crc kubenswrapper[4722]: I0219 19:25:05.019328 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k6bqq"]
Feb 19 19:25:05 crc kubenswrapper[4722]: I0219 19:25:05.076529 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d31d88d-2e34-4b55-b843-b8a67b957680" path="/var/lib/kubelet/pods/8d31d88d-2e34-4b55-b843-b8a67b957680/volumes"
Feb 19 19:25:11 crc kubenswrapper[4722]: I0219 19:25:11.798368 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:25:11 crc kubenswrapper[4722]: I0219 19:25:11.798707 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:25:11 crc kubenswrapper[4722]: I0219 19:25:11.798747 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl"
Feb 19 19:25:11 crc kubenswrapper[4722]: I0219 19:25:11.799383 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 19:25:11 crc kubenswrapper[4722]: I0219 19:25:11.799438 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312" gracePeriod=600
Feb 19 19:25:12 crc kubenswrapper[4722]: I0219 19:25:12.008204 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312" exitCode=0
Feb 19 19:25:12 crc kubenswrapper[4722]: I0219 19:25:12.008300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312"}
Feb 19 19:25:12 crc kubenswrapper[4722]: I0219 19:25:12.008485 4722 scope.go:117] "RemoveContainer" containerID="dcfb3546b07a9f33842eb5ef331961ffa59d15fcb98b5479b8867f8dd667782d"
Feb 19 19:25:13 crc kubenswrapper[4722]: I0219 19:25:13.018640 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c"}
Feb 19 19:27:41 crc kubenswrapper[4722]: I0219 19:27:41.798429 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:27:41 crc kubenswrapper[4722]: I0219 19:27:41.799358 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:28:11 crc kubenswrapper[4722]: I0219 19:28:11.798778 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:28:11 crc kubenswrapper[4722]: I0219 19:28:11.799499 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.798100 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.798807 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.798878 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl"
Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.799681 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.799762 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c" gracePeriod=600
Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.941823 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c" exitCode=0
Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.941891 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c"}
Feb 19 19:28:41 crc kubenswrapper[4722]: I0219 19:28:41.942433 4722 scope.go:117] "RemoveContainer" containerID="793b4919ac9772a89f95b2b76957a7ffe6ea089b9abb948aa9c7330908d0f312"
Feb 19 19:28:42 crc kubenswrapper[4722]: I0219 19:28:42.952394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84"}
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.492787 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"]
Feb 19 19:29:33 crc kubenswrapper[4722]: E0219 19:29:33.493590 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d31d88d-2e34-4b55-b843-b8a67b957680" containerName="registry"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.493604 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d31d88d-2e34-4b55-b843-b8a67b957680" containerName="registry"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.493721 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d31d88d-2e34-4b55-b843-b8a67b957680" containerName="registry"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.494543 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.496439 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.509451 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"]
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.656396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.656494 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-847tq\" (UniqueName: \"kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.656581 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.757606 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.757715 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-847tq\" (UniqueName: \"kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.757804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.758292 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.758468 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.780277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-847tq\" (UniqueName: \"kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:33 crc kubenswrapper[4722]: I0219 19:29:33.812331 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:34 crc kubenswrapper[4722]: I0219 19:29:34.208584 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"]
Feb 19 19:29:34 crc kubenswrapper[4722]: I0219 19:29:34.266912 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" event={"ID":"ae31c080-c2a8-484e-9d6a-bd55ca4ae533","Type":"ContainerStarted","Data":"1308967f71d0da97d8a17acc24d6fc36d84d34da1a47983f745a8e51d759a0d1"}
Feb 19 19:29:35 crc kubenswrapper[4722]: I0219 19:29:35.271896 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerID="05a15e1754c52e72e4ad935cf4ccc48bc64ecd37b25378cce07f5da46eaa3ad8" exitCode=0
Feb 19 19:29:35 crc kubenswrapper[4722]: I0219 19:29:35.271998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" event={"ID":"ae31c080-c2a8-484e-9d6a-bd55ca4ae533","Type":"ContainerDied","Data":"05a15e1754c52e72e4ad935cf4ccc48bc64ecd37b25378cce07f5da46eaa3ad8"}
Feb 19 19:29:35 crc kubenswrapper[4722]: I0219 19:29:35.273202 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 19:29:37 crc kubenswrapper[4722]: I0219 19:29:37.285999 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerID="bc043e9398d53a46c65eb4ada63ed6b08dc67bdc0122b2f97f793a5d9f61963c" exitCode=0
Feb 19 19:29:37 crc kubenswrapper[4722]: I0219 19:29:37.286116 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" event={"ID":"ae31c080-c2a8-484e-9d6a-bd55ca4ae533","Type":"ContainerDied","Data":"bc043e9398d53a46c65eb4ada63ed6b08dc67bdc0122b2f97f793a5d9f61963c"}
Feb 19 19:29:38 crc kubenswrapper[4722]: I0219 19:29:38.292845 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerID="ab5dc2d8e8e2ea2125bcadf938b8bd230c98a434bfab23a5895f1114319d2456" exitCode=0
Feb 19 19:29:38 crc kubenswrapper[4722]: I0219 19:29:38.292891 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" event={"ID":"ae31c080-c2a8-484e-9d6a-bd55ca4ae533","Type":"ContainerDied","Data":"ab5dc2d8e8e2ea2125bcadf938b8bd230c98a434bfab23a5895f1114319d2456"}
Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.530624 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m"
Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.635388 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle\") pod \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") "
Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.635454 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util\") pod \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") "
Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.635518 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-847tq\" (UniqueName: \"kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq\") pod \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\" (UID: \"ae31c080-c2a8-484e-9d6a-bd55ca4ae533\") "
Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.640055 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle" (OuterVolumeSpecName: "bundle") pod "ae31c080-c2a8-484e-9d6a-bd55ca4ae533" (UID: "ae31c080-c2a8-484e-9d6a-bd55ca4ae533"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.647026 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq" (OuterVolumeSpecName: "kube-api-access-847tq") pod "ae31c080-c2a8-484e-9d6a-bd55ca4ae533" (UID: "ae31c080-c2a8-484e-9d6a-bd55ca4ae533"). InnerVolumeSpecName "kube-api-access-847tq".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.648280 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.648376 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-847tq\" (UniqueName: \"kubernetes.io/projected/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-kube-api-access-847tq\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.717613 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util" (OuterVolumeSpecName: "util") pod "ae31c080-c2a8-484e-9d6a-bd55ca4ae533" (UID: "ae31c080-c2a8-484e-9d6a-bd55ca4ae533"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:29:39 crc kubenswrapper[4722]: I0219 19:29:39.749672 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ae31c080-c2a8-484e-9d6a-bd55ca4ae533-util\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:40 crc kubenswrapper[4722]: I0219 19:29:40.305486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" event={"ID":"ae31c080-c2a8-484e-9d6a-bd55ca4ae533","Type":"ContainerDied","Data":"1308967f71d0da97d8a17acc24d6fc36d84d34da1a47983f745a8e51d759a0d1"} Feb 19 19:29:40 crc kubenswrapper[4722]: I0219 19:29:40.305541 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1308967f71d0da97d8a17acc24d6fc36d84d34da1a47983f745a8e51d759a0d1" Feb 19 19:29:40 crc kubenswrapper[4722]: I0219 19:29:40.305562 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m" Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.858497 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsfln"] Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859112 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-controller" containerID="cri-o://f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859173 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="northd" containerID="cri-o://2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859231 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-acl-logging" containerID="cri-o://5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859236 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-node" containerID="cri-o://e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859286 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859284 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="sbdb" containerID="cri-o://4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.859233 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="nbdb" containerID="cri-o://8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" gracePeriod=30 Feb 19 19:29:44 crc kubenswrapper[4722]: I0219 19:29:44.892013 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" containerID="cri-o://c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" gracePeriod=30 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.225087 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/3.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.228100 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovn-acl-logging/0.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.228802 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovn-controller/0.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.229380 4722 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjr2p\" (UniqueName: \"kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321126 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321203 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321228 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321254 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321273 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321305 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321313 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321322 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321335 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321337 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321371 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321402 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321495 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321492 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket" (OuterVolumeSpecName: "log-socket") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321547 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321544 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log" (OuterVolumeSpecName: "node-log") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321563 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321561 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). 
InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321590 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321601 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321655 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321689 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321660 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321717 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash" (OuterVolumeSpecName: "host-slash") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321800 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321827 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.321848 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd\") pod \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\" (UID: \"5eb7c404-f96e-43a7-b20f-b45d856c75a5\") " Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322116 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322143 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322177 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322190 4722 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322207 4722 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322221 4722 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322233 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322245 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322257 4722 reconciler_common.go:293] "Volume detached for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322268 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322306 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322319 4722 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-log-socket\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322329 4722 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-node-log\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322339 4722 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322349 4722 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322362 4722 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-var-lib-openvswitch\") 
on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322372 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.322383 4722 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-slash\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.332347 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p" (OuterVolumeSpecName: "kube-api-access-zjr2p") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "kube-api-access-zjr2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.334464 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.338688 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovnkube-controller/3.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.342659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "5eb7c404-f96e-43a7-b20f-b45d856c75a5" (UID: "5eb7c404-f96e-43a7-b20f-b45d856c75a5"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.350214 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovn-acl-logging/0.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.350659 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsfln_5eb7c404-f96e-43a7-b20f-b45d856c75a5/ovn-controller/0.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.350978 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351001 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351009 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 
19:29:45.351016 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351022 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351029 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" exitCode=0 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351036 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" exitCode=143 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351043 4722 generic.go:334] "Generic (PLEG): container finished" podID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" exitCode=143 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351081 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351113 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351123 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351132 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351141 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351162 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351173 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351183 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351188 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351195 4722 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351202 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351209 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351215 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351223 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351228 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351235 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351243 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} Feb 19 19:29:45 crc 
kubenswrapper[4722]: I0219 19:29:45.351248 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351254 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351259 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351264 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351269 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351274 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351279 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351284 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:29:45 crc 
kubenswrapper[4722]: I0219 19:29:45.351288 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351296 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351305 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351311 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351318 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351323 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351328 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351333 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351338 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351343 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351348 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351353 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351360 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" event={"ID":"5eb7c404-f96e-43a7-b20f-b45d856c75a5","Type":"ContainerDied","Data":"80457ad8997939dc8e0991d051b5ca049affdba095f79270711bc1380ced8db4"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351367 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351372 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351379 4722 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351384 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351389 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351394 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351399 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351405 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351410 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351415 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351429 4722 
scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.351563 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsfln" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359083 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tmh6g"] Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359286 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359298 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359308 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359315 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359323 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359329 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359337 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="northd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359342 4722 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="northd" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359350 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="sbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359356 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="sbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359363 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359370 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359380 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="util" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359386 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="util" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359396 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-acl-logging" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359403 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-acl-logging" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359412 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kubecfg-setup" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359418 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" 
containerName="kubecfg-setup" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359423 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-node" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359429 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-node" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359439 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359445 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359452 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="extract" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359458 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="extract" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359467 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="nbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359473 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="nbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359482 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="pull" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359487 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="pull" Feb 19 19:29:45 crc 
kubenswrapper[4722]: E0219 19:29:45.359494 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359499 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359579 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="nbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359589 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovn-acl-logging" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359598 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359606 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359615 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae31c080-c2a8-484e-9d6a-bd55ca4ae533" containerName="extract" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359623 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-node" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359632 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359640 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" 
containerName="ovn-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359646 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="northd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359654 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="sbdb" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359662 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359669 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.359751 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359757 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.359837 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" containerName="ovnkube-controller" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.364459 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.372870 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.373037 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/2.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.373056 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.373126 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.374472 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/1.log" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.374514 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5" containerID="1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2" exitCode=2 Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.374544 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerDied","Data":"1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.374564 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16"} Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.374955 4722 scope.go:117] "RemoveContainer" 
containerID="1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.375124 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jnvgg_openshift-multus(7a80fcd7-8ac4-4e82-8f14-93d225898bb5)\"" pod="openshift-multus/multus-jnvgg" podUID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.379654 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.403385 4722 scope.go:117] "RemoveContainer" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.423359 4722 scope.go:117] "RemoveContainer" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424384 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-config\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-netns\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424443 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-slash\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424492 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-systemd\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424522 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-systemd-units\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-node-log\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-bin\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424597 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-etc-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-var-lib-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424691 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-env-overrides\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424711 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovn-node-metrics-cert\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424728 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-log-socket\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424745 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424763 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr6h9\" (UniqueName: \"kubernetes.io/projected/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-kube-api-access-hr6h9\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424783 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-kubelet\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 
19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-ovn\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424817 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-netd\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424832 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-script-lib\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424865 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjr2p\" (UniqueName: \"kubernetes.io/projected/5eb7c404-f96e-43a7-b20f-b45d856c75a5-kube-api-access-zjr2p\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424876 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5eb7c404-f96e-43a7-b20f-b45d856c75a5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424885 4722 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-host-cni-bin\") on node 
\"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424894 4722 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.424903 4722 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5eb7c404-f96e-43a7-b20f-b45d856c75a5-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.441400 4722 scope.go:117] "RemoveContainer" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.459323 4722 scope.go:117] "RemoveContainer" containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.465206 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsfln"] Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.470441 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsfln"] Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.485974 4722 scope.go:117] "RemoveContainer" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.500708 4722 scope.go:117] "RemoveContainer" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.514753 4722 scope.go:117] "RemoveContainer" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-var-lib-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525791 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525809 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-env-overrides\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovn-node-metrics-cert\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525874 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-log-socket\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-var-lib-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525953 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-log-socket\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.525889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr6h9\" (UniqueName: 
\"kubernetes.io/projected/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-kube-api-access-hr6h9\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-kubelet\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-ovn\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526096 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-netd\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-script-lib\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526132 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-config\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526131 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-kubelet\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526168 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-netns\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-slash\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526188 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-netd\") pod 
\"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526204 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-systemd\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526222 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-systemd-units\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526244 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-node-log\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-bin\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526284 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-etc-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526332 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-etc-openvswitch\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526360 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-slash\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526221 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-ovn\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526383 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-node-log\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526426 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-cni-bin\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526449 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-run-systemd\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526486 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-systemd-units\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-env-overrides\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526583 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526671 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-host-run-netns\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.526880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-script-lib\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.529245 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovn-node-metrics-cert\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.529722 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-ovnkube-config\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.530793 4722 scope.go:117] "RemoveContainer" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.549709 4722 scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.550122 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.550179 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} err="failed to get container status \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.550212 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.550652 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": container with ID starting with 5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20 not found: ID does not exist" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.550693 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} err="failed to get container status \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": rpc error: code = NotFound desc = could not find container \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": container with ID starting with 5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.550718 4722 scope.go:117] "RemoveContainer" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.551209 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": container with ID starting with 4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96 not found: ID does not exist" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551238 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} err="failed to get container status \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": rpc error: code = NotFound desc = could not find container \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": container with ID starting with 4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551258 4722 scope.go:117] "RemoveContainer" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.551533 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": container with ID starting with 8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd not found: ID does not exist" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551583 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} err="failed to get container status \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": rpc error: code = NotFound desc = could not find container 
\"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": container with ID starting with 8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551616 4722 scope.go:117] "RemoveContainer" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551699 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr6h9\" (UniqueName: \"kubernetes.io/projected/1b2b829e-4254-4c5c-a130-ed72dcc47cc7-kube-api-access-hr6h9\") pod \"ovnkube-node-tmh6g\" (UID: \"1b2b829e-4254-4c5c-a130-ed72dcc47cc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.551882 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": container with ID starting with 2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627 not found: ID does not exist" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551915 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} err="failed to get container status \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": rpc error: code = NotFound desc = could not find container \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": container with ID starting with 2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.551938 4722 scope.go:117] "RemoveContainer" 
containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.552140 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": container with ID starting with 3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10 not found: ID does not exist" containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552196 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} err="failed to get container status \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": rpc error: code = NotFound desc = could not find container \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": container with ID starting with 3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552214 4722 scope.go:117] "RemoveContainer" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.552421 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": container with ID starting with e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce not found: ID does not exist" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552454 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} err="failed to get container status \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": rpc error: code = NotFound desc = could not find container \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": container with ID starting with e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552472 4722 scope.go:117] "RemoveContainer" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.552707 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": container with ID starting with 5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700 not found: ID does not exist" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552739 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} err="failed to get container status \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": rpc error: code = NotFound desc = could not find container \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": container with ID starting with 5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552757 4722 scope.go:117] "RemoveContainer" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.552972 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": container with ID starting with f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9 not found: ID does not exist" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.552995 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} err="failed to get container status \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": rpc error: code = NotFound desc = could not find container \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": container with ID starting with f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553008 4722 scope.go:117] "RemoveContainer" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: E0219 19:29:45.553191 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": container with ID starting with 952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1 not found: ID does not exist" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553214 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} err="failed to get container status \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": rpc error: code = NotFound desc = could not find container 
\"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": container with ID starting with 952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553227 4722 scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553406 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} err="failed to get container status \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553423 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553616 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} err="failed to get container status \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": rpc error: code = NotFound desc = could not find container \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": container with ID starting with 5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553645 4722 scope.go:117] "RemoveContainer" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553804 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} err="failed to get container status \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": rpc error: code = NotFound desc = could not find container \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": container with ID starting with 4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553822 4722 scope.go:117] "RemoveContainer" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553975 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} err="failed to get container status \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": rpc error: code = NotFound desc = could not find container \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": container with ID starting with 8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.553990 4722 scope.go:117] "RemoveContainer" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554143 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} err="failed to get container status \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": rpc error: code = NotFound desc = could not find container \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": container with ID starting with 
2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554179 4722 scope.go:117] "RemoveContainer" containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554577 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} err="failed to get container status \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": rpc error: code = NotFound desc = could not find container \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": container with ID starting with 3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554618 4722 scope.go:117] "RemoveContainer" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554933 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} err="failed to get container status \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": rpc error: code = NotFound desc = could not find container \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": container with ID starting with e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.554950 4722 scope.go:117] "RemoveContainer" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555234 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} err="failed to get container status \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": rpc error: code = NotFound desc = could not find container \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": container with ID starting with 5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555276 4722 scope.go:117] "RemoveContainer" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555564 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} err="failed to get container status \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": rpc error: code = NotFound desc = could not find container \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": container with ID starting with f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555580 4722 scope.go:117] "RemoveContainer" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555847 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} err="failed to get container status \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": rpc error: code = NotFound desc = could not find container \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": container with ID starting with 952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1 not found: ID does not 
exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.555865 4722 scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556192 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} err="failed to get container status \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556233 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556538 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} err="failed to get container status \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": rpc error: code = NotFound desc = could not find container \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": container with ID starting with 5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556556 4722 scope.go:117] "RemoveContainer" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556755 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} err="failed to get container status 
\"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": rpc error: code = NotFound desc = could not find container \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": container with ID starting with 4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556780 4722 scope.go:117] "RemoveContainer" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556947 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} err="failed to get container status \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": rpc error: code = NotFound desc = could not find container \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": container with ID starting with 8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.556962 4722 scope.go:117] "RemoveContainer" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557115 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} err="failed to get container status \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": rpc error: code = NotFound desc = could not find container \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": container with ID starting with 2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557130 4722 scope.go:117] "RemoveContainer" 
containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557433 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} err="failed to get container status \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": rpc error: code = NotFound desc = could not find container \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": container with ID starting with 3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557464 4722 scope.go:117] "RemoveContainer" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557677 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} err="failed to get container status \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": rpc error: code = NotFound desc = could not find container \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": container with ID starting with e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557705 4722 scope.go:117] "RemoveContainer" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557899 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} err="failed to get container status \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": rpc error: code = NotFound desc = could 
not find container \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": container with ID starting with 5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.557926 4722 scope.go:117] "RemoveContainer" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558129 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} err="failed to get container status \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": rpc error: code = NotFound desc = could not find container \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": container with ID starting with f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558170 4722 scope.go:117] "RemoveContainer" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558434 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} err="failed to get container status \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": rpc error: code = NotFound desc = could not find container \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": container with ID starting with 952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558461 4722 scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 
19:29:45.558716 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} err="failed to get container status \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558741 4722 scope.go:117] "RemoveContainer" containerID="5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558962 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20"} err="failed to get container status \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": rpc error: code = NotFound desc = could not find container \"5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20\": container with ID starting with 5aec2aa52bccb68f7946523f3916bc909a536d05317898b8f1ff62c933f69a20 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.558984 4722 scope.go:117] "RemoveContainer" containerID="4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559274 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96"} err="failed to get container status \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": rpc error: code = NotFound desc = could not find container \"4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96\": container with ID starting with 
4fafe5e99eb48f7a77d7ddce57d40ed26a446689734648282374c0ddb47acf96 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559298 4722 scope.go:117] "RemoveContainer" containerID="8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559590 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd"} err="failed to get container status \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": rpc error: code = NotFound desc = could not find container \"8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd\": container with ID starting with 8a44ebaad5d1872beaa97d73539955df4436bfc84f31fdef79c51e8832a934cd not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559623 4722 scope.go:117] "RemoveContainer" containerID="2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559820 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627"} err="failed to get container status \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": rpc error: code = NotFound desc = could not find container \"2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627\": container with ID starting with 2368d960710a56435f7075d5c83097eeafc51bae1e0d0b8cecd23547351b2627 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.559838 4722 scope.go:117] "RemoveContainer" containerID="3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560029 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10"} err="failed to get container status \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": rpc error: code = NotFound desc = could not find container \"3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10\": container with ID starting with 3015781dc2da8c90a690de5f8ac06cdaf63398a7505297ac52d0b39032059a10 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560054 4722 scope.go:117] "RemoveContainer" containerID="e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560287 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce"} err="failed to get container status \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": rpc error: code = NotFound desc = could not find container \"e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce\": container with ID starting with e9de58967733b071a76181a427a403ff3ae2a409ae858dffedcda0601b4e75ce not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560307 4722 scope.go:117] "RemoveContainer" containerID="5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560521 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700"} err="failed to get container status \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": rpc error: code = NotFound desc = could not find container \"5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700\": container with ID starting with 5304128d02aca8001bd37ab144e59b407850cf51f700ccfb350d44dfa1d5f700 not found: ID does not 
exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560548 4722 scope.go:117] "RemoveContainer" containerID="f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560764 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9"} err="failed to get container status \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": rpc error: code = NotFound desc = could not find container \"f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9\": container with ID starting with f23fe775dbb8833e8ae1d2ac2961a923816e810f9d501d68c7d25fceb06920a9 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.560793 4722 scope.go:117] "RemoveContainer" containerID="952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.561009 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1"} err="failed to get container status \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": rpc error: code = NotFound desc = could not find container \"952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1\": container with ID starting with 952f71c6302bc1ea737280f5c1125bf7c6bd72572471827b47472c52f88f93b1 not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.561031 4722 scope.go:117] "RemoveContainer" containerID="c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.561294 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d"} err="failed to get container status 
\"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": rpc error: code = NotFound desc = could not find container \"c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d\": container with ID starting with c4693987e7530116508595c9fcd64a449e25b4cce1911eec8829c2e91abe792d not found: ID does not exist" Feb 19 19:29:45 crc kubenswrapper[4722]: I0219 19:29:45.690711 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:45 crc kubenswrapper[4722]: W0219 19:29:45.707613 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b2b829e_4254_4c5c_a130_ed72dcc47cc7.slice/crio-3c7c392034380ec34e607d974a8ce90e64952e9e07869b00c115d3fb4503337b WatchSource:0}: Error finding container 3c7c392034380ec34e607d974a8ce90e64952e9e07869b00c115d3fb4503337b: Status 404 returned error can't find the container with id 3c7c392034380ec34e607d974a8ce90e64952e9e07869b00c115d3fb4503337b Feb 19 19:29:46 crc kubenswrapper[4722]: I0219 19:29:46.380616 4722 generic.go:334] "Generic (PLEG): container finished" podID="1b2b829e-4254-4c5c-a130-ed72dcc47cc7" containerID="4bf185214f3644a24e32df5d310671d0683ebc3edcb6d28dd643c8846c67184e" exitCode=0 Feb 19 19:29:46 crc kubenswrapper[4722]: I0219 19:29:46.380651 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerDied","Data":"4bf185214f3644a24e32df5d310671d0683ebc3edcb6d28dd643c8846c67184e"} Feb 19 19:29:46 crc kubenswrapper[4722]: I0219 19:29:46.380670 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"3c7c392034380ec34e607d974a8ce90e64952e9e07869b00c115d3fb4503337b"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 
19:29:47.079587 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eb7c404-f96e-43a7-b20f-b45d856c75a5" path="/var/lib/kubelet/pods/5eb7c404-f96e-43a7-b20f-b45d856c75a5/volumes" Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.387869 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"1d098cfdd051c75593d774b27da82684e58c737e122ab035584717c4f33a7c37"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.388127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"6a9ba0f35c9d2704c8f98b593133e289ee645378a869f3082e2c0a01f8c2ef46"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.388137 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"69d72ac00e6a4b8153c6f4a81456f2b8a6fc62140a5cb35d5ba12ff3fb99cd7c"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.388145 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"c8da5603c232c8713e9df3dc78a1c2b636b33a23b756a20515b71b2567ab73a9"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.388168 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"f19279e6e8b313dcdd1cbe3dcdda50dd7a9766155963895e14134ea5a6cc399e"} Feb 19 19:29:47 crc kubenswrapper[4722]: I0219 19:29:47.388179 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" 
event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"62d4788570ec88e00e4f2f93a88ca63d605383e3f28ef63c1727f7dd170d1b4f"} Feb 19 19:29:50 crc kubenswrapper[4722]: I0219 19:29:50.403632 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"a87638205fc09488df3965e599b074028e0c991c41692673bd7cc58945dd41ce"} Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.056337 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn"] Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.056922 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.059143 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.059181 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-ks2vz" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.059507 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.113296 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z55j\" (UniqueName: \"kubernetes.io/projected/572e9436-e389-4b1e-b86f-e13f14f8d3eb-kube-api-access-8z55j\") pod \"obo-prometheus-operator-68bc856cb9-v7lzn\" (UID: \"572e9436-e389-4b1e-b86f-e13f14f8d3eb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.183413 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"] Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.184046 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.185924 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qvqrv" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.186359 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.190906 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"] Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.191534 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.214286 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.214357 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.214395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.214420 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z55j\" (UniqueName: \"kubernetes.io/projected/572e9436-e389-4b1e-b86f-e13f14f8d3eb-kube-api-access-8z55j\") pod \"obo-prometheus-operator-68bc856cb9-v7lzn\" (UID: \"572e9436-e389-4b1e-b86f-e13f14f8d3eb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc 
kubenswrapper[4722]: I0219 19:29:51.214442 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.246544 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z55j\" (UniqueName: \"kubernetes.io/projected/572e9436-e389-4b1e-b86f-e13f14f8d3eb-kube-api-access-8z55j\") pod \"obo-prometheus-operator-68bc856cb9-v7lzn\" (UID: \"572e9436-e389-4b1e-b86f-e13f14f8d3eb\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.315886 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.315994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.316044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.316116 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.322353 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.322625 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.333714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc8f56cb-a9d1-4b27-adca-40adf6902cc8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb\" (UID: \"cc8f56cb-a9d1-4b27-adca-40adf6902cc8\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.333809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1577ee2f-abd8-4e61-9fd1-238960e8bdf6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq\" (UID: \"1577ee2f-abd8-4e61-9fd1-238960e8bdf6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.372726 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.392616 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8xtkk"] Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.393785 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.395701 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-mzsjk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.395927 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.401756 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(d1de8a316abdeee9128adf0721951ec7412287a40c10b38601a18f9eb6a33498): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.401814 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(d1de8a316abdeee9128adf0721951ec7412287a40c10b38601a18f9eb6a33498): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.401847 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(d1de8a316abdeee9128adf0721951ec7412287a40c10b38601a18f9eb6a33498): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.401890 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(d1de8a316abdeee9128adf0721951ec7412287a40c10b38601a18f9eb6a33498): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" podUID="572e9436-e389-4b1e-b86f-e13f14f8d3eb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.416888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/68e6d18b-f149-46fb-ba46-8fb37d82712a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.416949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8svlc\" (UniqueName: \"kubernetes.io/projected/68e6d18b-f149-46fb-ba46-8fb37d82712a-kube-api-access-8svlc\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.500592 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.508474 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.517525 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/68e6d18b-f149-46fb-ba46-8fb37d82712a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.517577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8svlc\" (UniqueName: \"kubernetes.io/projected/68e6d18b-f149-46fb-ba46-8fb37d82712a-kube-api-access-8svlc\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.526444 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(6ca8116209fc4a5fd696013fc65c4c921bc47f0ca7d09f8274ffcef5a72a8dc7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.526524 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(6ca8116209fc4a5fd696013fc65c4c921bc47f0ca7d09f8274ffcef5a72a8dc7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.526554 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(6ca8116209fc4a5fd696013fc65c4c921bc47f0ca7d09f8274ffcef5a72a8dc7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.526614 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(6ca8116209fc4a5fd696013fc65c4c921bc47f0ca7d09f8274ffcef5a72a8dc7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" podUID="1577ee2f-abd8-4e61-9fd1-238960e8bdf6" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.528668 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/68e6d18b-f149-46fb-ba46-8fb37d82712a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.538779 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8svlc\" (UniqueName: \"kubernetes.io/projected/68e6d18b-f149-46fb-ba46-8fb37d82712a-kube-api-access-8svlc\") pod \"observability-operator-59bdc8b94-8xtkk\" (UID: \"68e6d18b-f149-46fb-ba46-8fb37d82712a\") " pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.540346 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(22465c88df50cd4c95117b7d3d2acbbf9288d1d3cc43f50f980f1c58ee81c190): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.540429 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(22465c88df50cd4c95117b7d3d2acbbf9288d1d3cc43f50f980f1c58ee81c190): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.540452 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(22465c88df50cd4c95117b7d3d2acbbf9288d1d3cc43f50f980f1c58ee81c190): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.540496 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(22465c88df50cd4c95117b7d3d2acbbf9288d1d3cc43f50f980f1c58ee81c190): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" podUID="cc8f56cb-a9d1-4b27-adca-40adf6902cc8" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.597740 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4qpbt"] Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.598622 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.600478 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gvb9x" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.619859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fnfx\" (UniqueName: \"kubernetes.io/projected/7f659845-54cc-4e5c-892c-a754900c1f39-kube-api-access-7fnfx\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.619908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f659845-54cc-4e5c-892c-a754900c1f39-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.721312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fnfx\" (UniqueName: \"kubernetes.io/projected/7f659845-54cc-4e5c-892c-a754900c1f39-kube-api-access-7fnfx\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.721367 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f659845-54cc-4e5c-892c-a754900c1f39-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 
19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.722258 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f659845-54cc-4e5c-892c-a754900c1f39-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.747173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fnfx\" (UniqueName: \"kubernetes.io/projected/7f659845-54cc-4e5c-892c-a754900c1f39-kube-api-access-7fnfx\") pod \"perses-operator-5bf474d74f-4qpbt\" (UID: \"7f659845-54cc-4e5c-892c-a754900c1f39\") " pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.750897 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.789471 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(ac303360ae86f9fcfce1f995d1b438873953ce48bedd93b87bad2b98e98b6b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.789549 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(ac303360ae86f9fcfce1f995d1b438873953ce48bedd93b87bad2b98e98b6b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.789570 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(ac303360ae86f9fcfce1f995d1b438873953ce48bedd93b87bad2b98e98b6b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.789621 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(ac303360ae86f9fcfce1f995d1b438873953ce48bedd93b87bad2b98e98b6b0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podUID="68e6d18b-f149-46fb-ba46-8fb37d82712a" Feb 19 19:29:51 crc kubenswrapper[4722]: I0219 19:29:51.913458 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.938812 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(49c7969d6156d2a95e26d3ef22dba9a38d01493541312eb47d2ae5f508f8de74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.938892 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(49c7969d6156d2a95e26d3ef22dba9a38d01493541312eb47d2ae5f508f8de74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.938923 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(49c7969d6156d2a95e26d3ef22dba9a38d01493541312eb47d2ae5f508f8de74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:51 crc kubenswrapper[4722]: E0219 19:29:51.938979 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(49c7969d6156d2a95e26d3ef22dba9a38d01493541312eb47d2ae5f508f8de74): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" podUID="7f659845-54cc-4e5c-892c-a754900c1f39" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.417422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" event={"ID":"1b2b829e-4254-4c5c-a130-ed72dcc47cc7","Type":"ContainerStarted","Data":"6116fe360f91a9c7c2b769555ed6c7f2c97cfbaa39edbdb83476ab624f25d005"} Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.417723 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.417777 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.447886 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.450909 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g" podStartSLOduration=7.450892913 
podStartE2EDuration="7.450892913s" podCreationTimestamp="2026-02-19 19:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:29:52.447179457 +0000 UTC m=+692.059529781" watchObservedRunningTime="2026-02-19 19:29:52.450892913 +0000 UTC m=+692.063243237" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.516093 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"] Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.516242 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.516719 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.521874 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn"] Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.522033 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.522522 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.526720 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"] Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.526844 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.527372 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.534136 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4qpbt"] Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.534315 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.534691 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.537597 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8xtkk"] Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.537679 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:52 crc kubenswrapper[4722]: I0219 19:29:52.537983 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.597347 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(11f0da367a9fc428716db27873592ef93314630df3e331eada86cb535bc2f34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.597418 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(11f0da367a9fc428716db27873592ef93314630df3e331eada86cb535bc2f34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.597441 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(11f0da367a9fc428716db27873592ef93314630df3e331eada86cb535bc2f34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.597493 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(11f0da367a9fc428716db27873592ef93314630df3e331eada86cb535bc2f34d): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" podUID="572e9436-e389-4b1e-b86f-e13f14f8d3eb" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.602464 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(0ef031778c1e9c40254a5d75805f7cab206939808ebc1f423d5258c895b3f631): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.602527 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(0ef031778c1e9c40254a5d75805f7cab206939808ebc1f423d5258c895b3f631): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.602546 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(0ef031778c1e9c40254a5d75805f7cab206939808ebc1f423d5258c895b3f631): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.602595 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(0ef031778c1e9c40254a5d75805f7cab206939808ebc1f423d5258c895b3f631): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" podUID="1577ee2f-abd8-4e61-9fd1-238960e8bdf6"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.619540 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(918c94aa7e5d540ea69f8fab65254bd6041ca64eaf44e85af4c7930c08a01abc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.619598 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(918c94aa7e5d540ea69f8fab65254bd6041ca64eaf44e85af4c7930c08a01abc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.619621 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(918c94aa7e5d540ea69f8fab65254bd6041ca64eaf44e85af4c7930c08a01abc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.619665 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(918c94aa7e5d540ea69f8fab65254bd6041ca64eaf44e85af4c7930c08a01abc): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" podUID="cc8f56cb-a9d1-4b27-adca-40adf6902cc8"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.625104 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(f1f1922c6b9c837aa8e35894c04090b1f5e4f03086dc3018a56de4ad658f5abb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.625187 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(f1f1922c6b9c837aa8e35894c04090b1f5e4f03086dc3018a56de4ad658f5abb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.625205 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(f1f1922c6b9c837aa8e35894c04090b1f5e4f03086dc3018a56de4ad658f5abb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.625237 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(f1f1922c6b9c837aa8e35894c04090b1f5e4f03086dc3018a56de4ad658f5abb): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podUID="68e6d18b-f149-46fb-ba46-8fb37d82712a"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.633263 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(330f81bbf641e5bfb522269d82debeeb49bf4ea92db1989cb86db8f9e327d996): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.633326 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(330f81bbf641e5bfb522269d82debeeb49bf4ea92db1989cb86db8f9e327d996): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.633347 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(330f81bbf641e5bfb522269d82debeeb49bf4ea92db1989cb86db8f9e327d996): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt"
Feb 19 19:29:52 crc kubenswrapper[4722]: E0219 19:29:52.633393 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(330f81bbf641e5bfb522269d82debeeb49bf4ea92db1989cb86db8f9e327d996): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" podUID="7f659845-54cc-4e5c-892c-a754900c1f39"
Feb 19 19:29:53 crc kubenswrapper[4722]: I0219 19:29:53.436914 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g"
Feb 19 19:29:53 crc kubenswrapper[4722]: I0219 19:29:53.461928 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g"
Feb 19 19:29:58 crc kubenswrapper[4722]: I0219 19:29:58.071129 4722 scope.go:117] "RemoveContainer" containerID="1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2"
Feb 19 19:29:58 crc kubenswrapper[4722]: E0219 19:29:58.072080 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-jnvgg_openshift-multus(7a80fcd7-8ac4-4e82-8f14-93d225898bb5)\"" pod="openshift-multus/multus-jnvgg" podUID="7a80fcd7-8ac4-4e82-8f14-93d225898bb5"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.165535 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"]
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.166394 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.170691 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.171024 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.180723 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"]
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.320624 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.320951 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.320999 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs9pt\" (UniqueName: \"kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.422145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs9pt\" (UniqueName: \"kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.422346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.422379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.423520 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.442875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.468303 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs9pt\" (UniqueName: \"kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt\") pod \"collect-profiles-29525490-f85qf\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: I0219 19:30:00.490301 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: E0219 19:30:00.517607 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(5478c4bb4ffadd6b155b6703872b4b0c20b42c0a9c3a0e8e1159fdb332c910ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:30:00 crc kubenswrapper[4722]: E0219 19:30:00.517732 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(5478c4bb4ffadd6b155b6703872b4b0c20b42c0a9c3a0e8e1159fdb332c910ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: E0219 19:30:00.517805 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(5478c4bb4ffadd6b155b6703872b4b0c20b42c0a9c3a0e8e1159fdb332c910ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:00 crc kubenswrapper[4722]: E0219 19:30:00.517905 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager(b6513190-cf4a-405f-a7ca-c35f37d63725)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager(b6513190-cf4a-405f-a7ca-c35f37d63725)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(5478c4bb4ffadd6b155b6703872b4b0c20b42c0a9c3a0e8e1159fdb332c910ef): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725"
Feb 19 19:30:01 crc kubenswrapper[4722]: I0219 19:30:01.479715 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:01 crc kubenswrapper[4722]: I0219 19:30:01.480133 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:01 crc kubenswrapper[4722]: E0219 19:30:01.500742 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(2a176446a67bb6e0c598b8c2b6082789810fa964536e3024791528428ab6d1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:30:01 crc kubenswrapper[4722]: E0219 19:30:01.500813 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(2a176446a67bb6e0c598b8c2b6082789810fa964536e3024791528428ab6d1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:01 crc kubenswrapper[4722]: E0219 19:30:01.500845 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(2a176446a67bb6e0c598b8c2b6082789810fa964536e3024791528428ab6d1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:01 crc kubenswrapper[4722]: E0219 19:30:01.500904 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager(b6513190-cf4a-405f-a7ca-c35f37d63725)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager(b6513190-cf4a-405f-a7ca-c35f37d63725)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29525490-f85qf_openshift-operator-lifecycle-manager_b6513190-cf4a-405f-a7ca-c35f37d63725_0(2a176446a67bb6e0c598b8c2b6082789810fa964536e3024791528428ab6d1f7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725"
Feb 19 19:30:05 crc kubenswrapper[4722]: I0219 19:30:05.071368 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"
Feb 19 19:30:05 crc kubenswrapper[4722]: I0219 19:30:05.071395 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt"
Feb 19 19:30:05 crc kubenswrapper[4722]: I0219 19:30:05.072886 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"
Feb 19 19:30:05 crc kubenswrapper[4722]: I0219 19:30:05.073188 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt"
Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.116680 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(4865f878f36d6c29a60dd9eac09b8cf71eb31289eab4125f075438507c23b38a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.116759 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(4865f878f36d6c29a60dd9eac09b8cf71eb31289eab4125f075438507c23b38a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt"
Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.116795 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(4865f878f36d6c29a60dd9eac09b8cf71eb31289eab4125f075438507c23b38a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt"
Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.116843 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-4qpbt_openshift-operators(7f659845-54cc-4e5c-892c-a754900c1f39)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-4qpbt_openshift-operators_7f659845-54cc-4e5c-892c-a754900c1f39_0(4865f878f36d6c29a60dd9eac09b8cf71eb31289eab4125f075438507c23b38a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" podUID="7f659845-54cc-4e5c-892c-a754900c1f39"
Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.123355 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(df8919c4951ac65ca6e5ae3c886d39f03b10419a5780f5020ef64f519433b21a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.123416 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(df8919c4951ac65ca6e5ae3c886d39f03b10419a5780f5020ef64f519433b21a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"
Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.123436 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(df8919c4951ac65ca6e5ae3c886d39f03b10419a5780f5020ef64f519433b21a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"
Feb 19 19:30:05 crc kubenswrapper[4722]: E0219 19:30:05.123479 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators(cc8f56cb-a9d1-4b27-adca-40adf6902cc8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_openshift-operators_cc8f56cb-a9d1-4b27-adca-40adf6902cc8_0(df8919c4951ac65ca6e5ae3c886d39f03b10419a5780f5020ef64f519433b21a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" podUID="cc8f56cb-a9d1-4b27-adca-40adf6902cc8"
Feb 19 19:30:06 crc kubenswrapper[4722]: I0219 19:30:06.070921 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk"
Feb 19 19:30:06 crc kubenswrapper[4722]: I0219 19:30:06.071644 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk"
Feb 19 19:30:06 crc kubenswrapper[4722]: E0219 19:30:06.092191 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(37c2d2aafbb3db7f5d4e16b1d702dbc19a6acb2778642c57eb3c3c7d58a3bc1c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:30:06 crc kubenswrapper[4722]: E0219 19:30:06.092257 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(37c2d2aafbb3db7f5d4e16b1d702dbc19a6acb2778642c57eb3c3c7d58a3bc1c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk"
Feb 19 19:30:06 crc kubenswrapper[4722]: E0219 19:30:06.092277 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(37c2d2aafbb3db7f5d4e16b1d702dbc19a6acb2778642c57eb3c3c7d58a3bc1c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk"
Feb 19 19:30:06 crc kubenswrapper[4722]: E0219 19:30:06.092320 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-8xtkk_openshift-operators(68e6d18b-f149-46fb-ba46-8fb37d82712a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-8xtkk_openshift-operators_68e6d18b-f149-46fb-ba46-8fb37d82712a_0(37c2d2aafbb3db7f5d4e16b1d702dbc19a6acb2778642c57eb3c3c7d58a3bc1c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podUID="68e6d18b-f149-46fb-ba46-8fb37d82712a"
Feb 19 19:30:07 crc kubenswrapper[4722]: I0219 19:30:07.071297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn"
Feb 19 19:30:07 crc kubenswrapper[4722]: I0219 19:30:07.071792 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn"
Feb 19 19:30:07 crc kubenswrapper[4722]: E0219 19:30:07.100612 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(f63451e7a444f406b43e27252995d71d3f5c16c7888ac2d2d1e6301fca816bbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:30:07 crc kubenswrapper[4722]: E0219 19:30:07.100686 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(f63451e7a444f406b43e27252995d71d3f5c16c7888ac2d2d1e6301fca816bbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn"
Feb 19 19:30:07 crc kubenswrapper[4722]: E0219 19:30:07.100722 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(f63451e7a444f406b43e27252995d71d3f5c16c7888ac2d2d1e6301fca816bbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn"
Feb 19 19:30:07 crc kubenswrapper[4722]: E0219 19:30:07.100767 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators(572e9436-e389-4b1e-b86f-e13f14f8d3eb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-v7lzn_openshift-operators_572e9436-e389-4b1e-b86f-e13f14f8d3eb_0(f63451e7a444f406b43e27252995d71d3f5c16c7888ac2d2d1e6301fca816bbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" podUID="572e9436-e389-4b1e-b86f-e13f14f8d3eb"
Feb 19 19:30:08 crc kubenswrapper[4722]: I0219 19:30:08.071261 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"
Feb 19 19:30:08 crc kubenswrapper[4722]: I0219 19:30:08.071778 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"
Feb 19 19:30:08 crc kubenswrapper[4722]: E0219 19:30:08.103222 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(d4bf68679a1be6a3d4d32077ab87bbac42a42fb4c01f8d9751dc5ebb95438918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 19:30:08 crc kubenswrapper[4722]: E0219 19:30:08.104177 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(d4bf68679a1be6a3d4d32077ab87bbac42a42fb4c01f8d9751dc5ebb95438918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"
Feb 19 19:30:08 crc kubenswrapper[4722]: E0219 19:30:08.104310 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(d4bf68679a1be6a3d4d32077ab87bbac42a42fb4c01f8d9751dc5ebb95438918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"
Feb 19 19:30:08 crc kubenswrapper[4722]: E0219 19:30:08.104454 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators(1577ee2f-abd8-4e61-9fd1-238960e8bdf6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_openshift-operators_1577ee2f-abd8-4e61-9fd1-238960e8bdf6_0(d4bf68679a1be6a3d4d32077ab87bbac42a42fb4c01f8d9751dc5ebb95438918): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" podUID="1577ee2f-abd8-4e61-9fd1-238960e8bdf6"
Feb 19 19:30:12 crc kubenswrapper[4722]: I0219 19:30:12.070870 4722 scope.go:117] "RemoveContainer" containerID="1d82d8ed7e562e39c1ca0e3f5b534a58cb4ab2f7fc1e4e4bea047ded2f5201a2"
Feb 19 19:30:12 crc kubenswrapper[4722]: I0219 19:30:12.535552 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/2.log"
Feb 19 19:30:12 crc kubenswrapper[4722]: I0219 19:30:12.536751 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/1.log"
Feb 19 19:30:12 crc kubenswrapper[4722]: I0219 19:30:12.536826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jnvgg" event={"ID":"7a80fcd7-8ac4-4e82-8f14-93d225898bb5","Type":"ContainerStarted","Data":"c50852e5b77d05de6aa6ecb4533ea82b9f06d7b4cf8cb98687ee0f15fe36d8dc"}
Feb 19 19:30:15 crc kubenswrapper[4722]: I0219 19:30:15.711369 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmh6g"
Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.071052 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.071120 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"
Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.071691 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"
Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.071888 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"
Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.489258 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb"]
Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.492688 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"]
Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.564350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" event={"ID":"cc8f56cb-a9d1-4b27-adca-40adf6902cc8","Type":"ContainerStarted","Data":"78f276ac129300e83d241e7681abee219295215443de74bfb3aa4ae2e9b7e9da"}
Feb 19 19:30:17 crc kubenswrapper[4722]: I0219 19:30:17.565488 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" event={"ID":"b6513190-cf4a-405f-a7ca-c35f37d63725","Type":"ContainerStarted","Data":"16269b9bb52f2154f76e0376730fd1345c43b7c76b17bb53cb71d2835a369a19"}
Feb 19 19:30:18 crc kubenswrapper[4722]: I0219 19:30:18.574924 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" event={"ID":"b6513190-cf4a-405f-a7ca-c35f37d63725","Type":"ContainerStarted","Data":"1a0fba6d0ff68b77b5d4af6abf07f7a3a985db19a68b0e1561f090e9701e0cbe"}
Feb 19 19:30:18 crc kubenswrapper[4722]: I0219 19:30:18.593589 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" podStartSLOduration=18.593556372 podStartE2EDuration="18.593556372s" podCreationTimestamp="2026-02-19 19:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:30:18.590798966 +0000 UTC m=+718.203149320" watchObservedRunningTime="2026-02-19 19:30:18.593556372 +0000 UTC m=+718.205906726"
Feb 19 19:30:19 crc kubenswrapper[4722]: I0219 19:30:19.583250 4722 generic.go:334] "Generic (PLEG): container finished" podID="b6513190-cf4a-405f-a7ca-c35f37d63725" containerID="1a0fba6d0ff68b77b5d4af6abf07f7a3a985db19a68b0e1561f090e9701e0cbe" exitCode=0
Feb 19 19:30:19 crc kubenswrapper[4722]: I0219 19:30:19.583310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" event={"ID":"b6513190-cf4a-405f-a7ca-c35f37d63725","Type":"ContainerDied","Data":"1a0fba6d0ff68b77b5d4af6abf07f7a3a985db19a68b0e1561f090e9701e0cbe"}
Feb 19 19:30:20 crc kubenswrapper[4722]: I0219 19:30:20.070205 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt"
Feb 19 19:30:20 crc kubenswrapper[4722]: I0219 19:30:20.070290 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk"
Feb 19 19:30:20 crc kubenswrapper[4722]: I0219 19:30:20.071051 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt"
Feb 19 19:30:20 crc kubenswrapper[4722]: I0219 19:30:20.071272 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk"
Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.070846 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.075389 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.269783 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.333408 4722 scope.go:117] "RemoveContainer" containerID="38ef2c66146d445f8a65e2065c010337765f05c7cc37a1017067b2143036fa16" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.398575 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume\") pod \"b6513190-cf4a-405f-a7ca-c35f37d63725\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.399050 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume\") pod \"b6513190-cf4a-405f-a7ca-c35f37d63725\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.399088 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs9pt\" (UniqueName: \"kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt\") pod \"b6513190-cf4a-405f-a7ca-c35f37d63725\" (UID: \"b6513190-cf4a-405f-a7ca-c35f37d63725\") " Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.400483 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "b6513190-cf4a-405f-a7ca-c35f37d63725" (UID: "b6513190-cf4a-405f-a7ca-c35f37d63725"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.400741 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6513190-cf4a-405f-a7ca-c35f37d63725-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.406237 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6513190-cf4a-405f-a7ca-c35f37d63725" (UID: "b6513190-cf4a-405f-a7ca-c35f37d63725"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.406326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt" (OuterVolumeSpecName: "kube-api-access-cs9pt") pod "b6513190-cf4a-405f-a7ca-c35f37d63725" (UID: "b6513190-cf4a-405f-a7ca-c35f37d63725"). InnerVolumeSpecName "kube-api-access-cs9pt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.501542 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs9pt\" (UniqueName: \"kubernetes.io/projected/b6513190-cf4a-405f-a7ca-c35f37d63725-kube-api-access-cs9pt\") on node \"crc\" DevicePath \"\"" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.501573 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6513190-cf4a-405f-a7ca-c35f37d63725-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.600033 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" event={"ID":"b6513190-cf4a-405f-a7ca-c35f37d63725","Type":"ContainerDied","Data":"16269b9bb52f2154f76e0376730fd1345c43b7c76b17bb53cb71d2835a369a19"} Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.600072 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16269b9bb52f2154f76e0376730fd1345c43b7c76b17bb53cb71d2835a369a19" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.600124 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.607634 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" event={"ID":"cc8f56cb-a9d1-4b27-adca-40adf6902cc8","Type":"ContainerStarted","Data":"82558eb0c28911811b707e4a99b732be0a01d144dafe7608a10ed064d3554ef4"} Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.610637 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jnvgg_7a80fcd7-8ac4-4e82-8f14-93d225898bb5/kube-multus/2.log" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.611366 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8xtkk"] Feb 19 19:30:21 crc kubenswrapper[4722]: W0219 19:30:21.615589 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68e6d18b_f149_46fb_ba46_8fb37d82712a.slice/crio-e669936a41e3be3c1c9a141c035aaa9754ee57e85671907688ab095cc11b1d74 WatchSource:0}: Error finding container e669936a41e3be3c1c9a141c035aaa9754ee57e85671907688ab095cc11b1d74: Status 404 returned error can't find the container with id e669936a41e3be3c1c9a141c035aaa9754ee57e85671907688ab095cc11b1d74 Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.644132 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb" podStartSLOduration=26.851998416 podStartE2EDuration="30.644100392s" podCreationTimestamp="2026-02-19 19:29:51 +0000 UTC" firstStartedPulling="2026-02-19 19:30:17.503587668 +0000 UTC m=+717.115937992" lastFinishedPulling="2026-02-19 19:30:21.295689634 +0000 UTC m=+720.908039968" observedRunningTime="2026-02-19 19:30:21.63597583 +0000 UTC m=+721.248326164" 
watchObservedRunningTime="2026-02-19 19:30:21.644100392 +0000 UTC m=+721.256450736" Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.695804 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn"] Feb 19 19:30:21 crc kubenswrapper[4722]: W0219 19:30:21.701339 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod572e9436_e389_4b1e_b86f_e13f14f8d3eb.slice/crio-5fe4fe6f64a546b0a7e5ae14ec212e1303fe8a7daa57612693d29e74a0164b02 WatchSource:0}: Error finding container 5fe4fe6f64a546b0a7e5ae14ec212e1303fe8a7daa57612693d29e74a0164b02: Status 404 returned error can't find the container with id 5fe4fe6f64a546b0a7e5ae14ec212e1303fe8a7daa57612693d29e74a0164b02 Feb 19 19:30:21 crc kubenswrapper[4722]: W0219 19:30:21.702340 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f659845_54cc_4e5c_892c_a754900c1f39.slice/crio-38ada98456e2e6eab6b764e8137edf31407c59642811be53f505cff17c150265 WatchSource:0}: Error finding container 38ada98456e2e6eab6b764e8137edf31407c59642811be53f505cff17c150265: Status 404 returned error can't find the container with id 38ada98456e2e6eab6b764e8137edf31407c59642811be53f505cff17c150265 Feb 19 19:30:21 crc kubenswrapper[4722]: I0219 19:30:21.705095 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4qpbt"] Feb 19 19:30:22 crc kubenswrapper[4722]: I0219 19:30:22.630574 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" event={"ID":"7f659845-54cc-4e5c-892c-a754900c1f39","Type":"ContainerStarted","Data":"38ada98456e2e6eab6b764e8137edf31407c59642811be53f505cff17c150265"} Feb 19 19:30:22 crc kubenswrapper[4722]: I0219 19:30:22.633908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" event={"ID":"68e6d18b-f149-46fb-ba46-8fb37d82712a","Type":"ContainerStarted","Data":"e669936a41e3be3c1c9a141c035aaa9754ee57e85671907688ab095cc11b1d74"} Feb 19 19:30:22 crc kubenswrapper[4722]: I0219 19:30:22.636663 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" event={"ID":"572e9436-e389-4b1e-b86f-e13f14f8d3eb","Type":"ContainerStarted","Data":"5fe4fe6f64a546b0a7e5ae14ec212e1303fe8a7daa57612693d29e74a0164b02"} Feb 19 19:30:23 crc kubenswrapper[4722]: I0219 19:30:23.071390 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:30:23 crc kubenswrapper[4722]: I0219 19:30:23.071839 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" Feb 19 19:30:23 crc kubenswrapper[4722]: I0219 19:30:23.662401 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq"] Feb 19 19:30:23 crc kubenswrapper[4722]: W0219 19:30:23.927203 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1577ee2f_abd8_4e61_9fd1_238960e8bdf6.slice/crio-2a09a240d83e1ce9a325e681b6a552c20419cc9734e684d1329b838f5d3a1a2c WatchSource:0}: Error finding container 2a09a240d83e1ce9a325e681b6a552c20419cc9734e684d1329b838f5d3a1a2c: Status 404 returned error can't find the container with id 2a09a240d83e1ce9a325e681b6a552c20419cc9734e684d1329b838f5d3a1a2c Feb 19 19:30:24 crc kubenswrapper[4722]: I0219 19:30:24.651377 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" 
event={"ID":"1577ee2f-abd8-4e61-9fd1-238960e8bdf6","Type":"ContainerStarted","Data":"2a09a240d83e1ce9a325e681b6a552c20419cc9734e684d1329b838f5d3a1a2c"} Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.664650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" event={"ID":"7f659845-54cc-4e5c-892c-a754900c1f39","Type":"ContainerStarted","Data":"0bd5ee31c75a951c493b2c2ca4402b386531ee61a599ed5fc4021920db0ef246"} Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.665007 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.665898 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" event={"ID":"68e6d18b-f149-46fb-ba46-8fb37d82712a","Type":"ContainerStarted","Data":"5cf514f490e4904f3f000080860a51c110530446d7f150f694a82586d5ff7c5f"} Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.666362 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.667359 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" event={"ID":"1577ee2f-abd8-4e61-9fd1-238960e8bdf6","Type":"ContainerStarted","Data":"c97c43d59f507122bdf9409502429cb4ae989ab87c9692e9bf196d9d27e6bb2c"} Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.669443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" event={"ID":"572e9436-e389-4b1e-b86f-e13f14f8d3eb","Type":"ContainerStarted","Data":"a6ccd71954305587c05c2fb66e8fdb36db4f19f68f0578f966deccff3695ed36"} Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.679846 4722 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" podStartSLOduration=31.329892945 podStartE2EDuration="35.679830553s" podCreationTimestamp="2026-02-19 19:29:51 +0000 UTC" firstStartedPulling="2026-02-19 19:30:21.705987928 +0000 UTC m=+721.318338252" lastFinishedPulling="2026-02-19 19:30:26.055925536 +0000 UTC m=+725.668275860" observedRunningTime="2026-02-19 19:30:26.678545552 +0000 UTC m=+726.290895886" watchObservedRunningTime="2026-02-19 19:30:26.679830553 +0000 UTC m=+726.292180877" Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.701719 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podStartSLOduration=31.232645089 podStartE2EDuration="35.701698393s" podCreationTimestamp="2026-02-19 19:29:51 +0000 UTC" firstStartedPulling="2026-02-19 19:30:21.618385952 +0000 UTC m=+721.230736276" lastFinishedPulling="2026-02-19 19:30:26.087439256 +0000 UTC m=+725.699789580" observedRunningTime="2026-02-19 19:30:26.698329398 +0000 UTC m=+726.310679742" watchObservedRunningTime="2026-02-19 19:30:26.701698393 +0000 UTC m=+726.314048717" Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.704671 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" Feb 19 19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.721431 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq" podStartSLOduration=35.721412636 podStartE2EDuration="35.721412636s" podCreationTimestamp="2026-02-19 19:29:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:30:26.719622221 +0000 UTC m=+726.331972565" watchObservedRunningTime="2026-02-19 19:30:26.721412636 +0000 UTC m=+726.333762970" Feb 19 
19:30:26 crc kubenswrapper[4722]: I0219 19:30:26.751471 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-v7lzn" podStartSLOduration=31.405899627 podStartE2EDuration="35.75145103s" podCreationTimestamp="2026-02-19 19:29:51 +0000 UTC" firstStartedPulling="2026-02-19 19:30:21.704518121 +0000 UTC m=+721.316868455" lastFinishedPulling="2026-02-19 19:30:26.050069534 +0000 UTC m=+725.662419858" observedRunningTime="2026-02-19 19:30:26.746442965 +0000 UTC m=+726.358793289" watchObservedRunningTime="2026-02-19 19:30:26.75145103 +0000 UTC m=+726.363801354" Feb 19 19:30:31 crc kubenswrapper[4722]: I0219 19:30:31.916588 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-4qpbt" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.865205 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-242s6"] Feb 19 19:30:37 crc kubenswrapper[4722]: E0219 19:30:37.865517 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725" containerName="collect-profiles" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.865534 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725" containerName="collect-profiles" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.865692 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725" containerName="collect-profiles" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.866196 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.872948 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-fz7bp"] Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.873811 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fz7bp" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.873934 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.873960 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-5qm5q" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.874045 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.879689 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-lr45c" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.884124 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-242s6"] Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.887325 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fz7bp"] Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.902955 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hzrck"] Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.903595 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.907175 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6qg99" Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.920816 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hzrck"] Feb 19 19:30:37 crc kubenswrapper[4722]: I0219 19:30:37.936378 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfvc\" (UniqueName: \"kubernetes.io/projected/9545d522-f459-4b98-ac7f-d107189b7497-kube-api-access-ngfvc\") pod \"cert-manager-858654f9db-fz7bp\" (UID: \"9545d522-f459-4b98-ac7f-d107189b7497\") " pod="cert-manager/cert-manager-858654f9db-fz7bp" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.038639 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfvc\" (UniqueName: \"kubernetes.io/projected/9545d522-f459-4b98-ac7f-d107189b7497-kube-api-access-ngfvc\") pod \"cert-manager-858654f9db-fz7bp\" (UID: \"9545d522-f459-4b98-ac7f-d107189b7497\") " pod="cert-manager/cert-manager-858654f9db-fz7bp" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.038722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdcnl\" (UniqueName: \"kubernetes.io/projected/e49e50d8-05f3-42f4-a03a-f3a750e1a134-kube-api-access-kdcnl\") pod \"cert-manager-webhook-687f57d79b-hzrck\" (UID: \"e49e50d8-05f3-42f4-a03a-f3a750e1a134\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.038828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jbft\" (UniqueName: 
\"kubernetes.io/projected/b1356eef-86bd-4fbf-beb6-a98cd8bc60b8-kube-api-access-6jbft\") pod \"cert-manager-cainjector-cf98fcc89-242s6\" (UID: \"b1356eef-86bd-4fbf-beb6-a98cd8bc60b8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.056053 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfvc\" (UniqueName: \"kubernetes.io/projected/9545d522-f459-4b98-ac7f-d107189b7497-kube-api-access-ngfvc\") pod \"cert-manager-858654f9db-fz7bp\" (UID: \"9545d522-f459-4b98-ac7f-d107189b7497\") " pod="cert-manager/cert-manager-858654f9db-fz7bp" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.140441 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdcnl\" (UniqueName: \"kubernetes.io/projected/e49e50d8-05f3-42f4-a03a-f3a750e1a134-kube-api-access-kdcnl\") pod \"cert-manager-webhook-687f57d79b-hzrck\" (UID: \"e49e50d8-05f3-42f4-a03a-f3a750e1a134\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.140497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jbft\" (UniqueName: \"kubernetes.io/projected/b1356eef-86bd-4fbf-beb6-a98cd8bc60b8-kube-api-access-6jbft\") pod \"cert-manager-cainjector-cf98fcc89-242s6\" (UID: \"b1356eef-86bd-4fbf-beb6-a98cd8bc60b8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.158490 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdcnl\" (UniqueName: \"kubernetes.io/projected/e49e50d8-05f3-42f4-a03a-f3a750e1a134-kube-api-access-kdcnl\") pod \"cert-manager-webhook-687f57d79b-hzrck\" (UID: \"e49e50d8-05f3-42f4-a03a-f3a750e1a134\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.158600 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jbft\" (UniqueName: \"kubernetes.io/projected/b1356eef-86bd-4fbf-beb6-a98cd8bc60b8-kube-api-access-6jbft\") pod \"cert-manager-cainjector-cf98fcc89-242s6\" (UID: \"b1356eef-86bd-4fbf-beb6-a98cd8bc60b8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.189119 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.199936 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fz7bp" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.222086 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.516012 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-242s6"] Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.519065 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fz7bp"] Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.529699 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hzrck"] Feb 19 19:30:38 crc kubenswrapper[4722]: W0219 19:30:38.534461 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode49e50d8_05f3_42f4_a03a_f3a750e1a134.slice/crio-92bacd70b994d955862cfbb497c96d1d3fef5e47402303e24bccfbe510cd3037 WatchSource:0}: Error finding container 92bacd70b994d955862cfbb497c96d1d3fef5e47402303e24bccfbe510cd3037: Status 404 returned error can't find the container with id 92bacd70b994d955862cfbb497c96d1d3fef5e47402303e24bccfbe510cd3037 
Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.738717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fz7bp" event={"ID":"9545d522-f459-4b98-ac7f-d107189b7497","Type":"ContainerStarted","Data":"b67c9229671b56ba1322ed5dbbdf35f841cd1e67d76f661e68ebc4ef19ec5b05"} Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.740287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" event={"ID":"e49e50d8-05f3-42f4-a03a-f3a750e1a134","Type":"ContainerStarted","Data":"92bacd70b994d955862cfbb497c96d1d3fef5e47402303e24bccfbe510cd3037"} Feb 19 19:30:38 crc kubenswrapper[4722]: I0219 19:30:38.741512 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" event={"ID":"b1356eef-86bd-4fbf-beb6-a98cd8bc60b8","Type":"ContainerStarted","Data":"2fc37e4eeae0a989d8d4667e09ed7712db1c39c2e444c38a29d03e0b4c7409b1"} Feb 19 19:30:42 crc kubenswrapper[4722]: I0219 19:30:42.772126 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" event={"ID":"b1356eef-86bd-4fbf-beb6-a98cd8bc60b8","Type":"ContainerStarted","Data":"5e3fa562f34cd8581894b729c28d1b5f25ce28e100318a0efe44f385cee0bae8"} Feb 19 19:30:42 crc kubenswrapper[4722]: I0219 19:30:42.795866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fz7bp" event={"ID":"9545d522-f459-4b98-ac7f-d107189b7497","Type":"ContainerStarted","Data":"48c1d84017602a58e21e5afdbfaab054fb62192b6c7ff091473bfb95d08326b1"} Feb 19 19:30:42 crc kubenswrapper[4722]: I0219 19:30:42.822802 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-fz7bp" podStartSLOduration=1.899098623 podStartE2EDuration="5.822785793s" podCreationTimestamp="2026-02-19 19:30:37 +0000 UTC" firstStartedPulling="2026-02-19 19:30:38.53478162 +0000 UTC 
m=+738.147131944" lastFinishedPulling="2026-02-19 19:30:42.4584688 +0000 UTC m=+742.070819114" observedRunningTime="2026-02-19 19:30:42.822439342 +0000 UTC m=+742.434789666" watchObservedRunningTime="2026-02-19 19:30:42.822785793 +0000 UTC m=+742.435136117" Feb 19 19:30:43 crc kubenswrapper[4722]: I0219 19:30:43.802543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" event={"ID":"e49e50d8-05f3-42f4-a03a-f3a750e1a134","Type":"ContainerStarted","Data":"0044e999c253f899989130bbc67c949975616f078a8a10a7e18cf186c724418e"} Feb 19 19:30:43 crc kubenswrapper[4722]: I0219 19:30:43.802791 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:43 crc kubenswrapper[4722]: I0219 19:30:43.825638 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" podStartSLOduration=2.843743537 podStartE2EDuration="6.825613337s" podCreationTimestamp="2026-02-19 19:30:37 +0000 UTC" firstStartedPulling="2026-02-19 19:30:38.535506583 +0000 UTC m=+738.147856897" lastFinishedPulling="2026-02-19 19:30:42.517376373 +0000 UTC m=+742.129726697" observedRunningTime="2026-02-19 19:30:43.823617194 +0000 UTC m=+743.435967518" watchObservedRunningTime="2026-02-19 19:30:43.825613337 +0000 UTC m=+743.437963671" Feb 19 19:30:43 crc kubenswrapper[4722]: I0219 19:30:43.851150 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-242s6" podStartSLOduration=2.91684871 podStartE2EDuration="6.851117739s" podCreationTimestamp="2026-02-19 19:30:37 +0000 UTC" firstStartedPulling="2026-02-19 19:30:38.527463383 +0000 UTC m=+738.139813697" lastFinishedPulling="2026-02-19 19:30:42.461732402 +0000 UTC m=+742.074082726" observedRunningTime="2026-02-19 19:30:43.841038376 +0000 UTC m=+743.453388710" watchObservedRunningTime="2026-02-19 
19:30:43.851117739 +0000 UTC m=+743.463468103" Feb 19 19:30:48 crc kubenswrapper[4722]: I0219 19:30:48.225777 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-hzrck" Feb 19 19:30:55 crc kubenswrapper[4722]: I0219 19:30:55.721605 4722 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.280033 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"] Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.282098 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"] Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.282232 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.437546 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mlmh\" (UniqueName: \"kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.437598 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.437621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.539054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mlmh\" (UniqueName: \"kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.539581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.539793 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.539982 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.540276 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.565979 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mlmh\" (UniqueName: \"kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh\") pod \"redhat-operators-jss6p\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.598739 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.799694 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.800058 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.813333 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"] Feb 19 19:31:11 crc kubenswrapper[4722]: W0219 19:31:11.846587 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac91b740_cc99_49ce_bda9_e209dfa22140.slice/crio-e8f9a17191d1c994b4eccb17d54e99e1c9b07621dd90301a1e81fb55767e45fb WatchSource:0}: Error finding container e8f9a17191d1c994b4eccb17d54e99e1c9b07621dd90301a1e81fb55767e45fb: Status 404 returned error can't find the container with id e8f9a17191d1c994b4eccb17d54e99e1c9b07621dd90301a1e81fb55767e45fb Feb 19 19:31:11 crc kubenswrapper[4722]: I0219 19:31:11.989390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerStarted","Data":"e8f9a17191d1c994b4eccb17d54e99e1c9b07621dd90301a1e81fb55767e45fb"} Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.702220 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2"] Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.703441 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.706173 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.719816 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2"] Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.758281 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb2gz\" (UniqueName: \"kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.758335 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.758392 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: 
I0219 19:31:12.859165 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.859559 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb2gz\" (UniqueName: \"kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.859613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.859700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.859947 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.878785 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb2gz\" (UniqueName: \"kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.996993 4722 generic.go:334] "Generic (PLEG): container finished" podID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerID="01d1ca82849ceb3bfa6f7f0cd551b947f9fd07718242afa579460d54fe5bc317" exitCode=0 Feb 19 19:31:12 crc kubenswrapper[4722]: I0219 19:31:12.997039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerDied","Data":"01d1ca82849ceb3bfa6f7f0cd551b947f9fd07718242afa579460d54fe5bc317"} Feb 19 19:31:13 crc kubenswrapper[4722]: I0219 19:31:13.015961 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:13 crc kubenswrapper[4722]: I0219 19:31:13.193135 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2"] Feb 19 19:31:13 crc kubenswrapper[4722]: W0219 19:31:13.207039 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f50f1aa_154d_409a_826d_c6c4b3c75559.slice/crio-e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4 WatchSource:0}: Error finding container e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4: Status 404 returned error can't find the container with id e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4 Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.009423 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerID="eb560083463bd32e81a3eb16d0b0a9b35ef46b2e728c36e011ce0b153ac00cb5" exitCode=0 Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.009482 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" event={"ID":"4f50f1aa-154d-409a-826d-c6c4b3c75559","Type":"ContainerDied","Data":"eb560083463bd32e81a3eb16d0b0a9b35ef46b2e728c36e011ce0b153ac00cb5"} Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.009543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" event={"ID":"4f50f1aa-154d-409a-826d-c6c4b3c75559","Type":"ContainerStarted","Data":"e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4"} Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.013323 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerStarted","Data":"8581b7f0946c1fe9f056107f35f0df54a5f8bc1c71c983b546735fbf76da75ad"} Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.995790 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.997761 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 19 19:31:14 crc kubenswrapper[4722]: I0219 19:31:14.999718 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.003984 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.004196 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.026864 4722 generic.go:334] "Generic (PLEG): container finished" podID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerID="8581b7f0946c1fe9f056107f35f0df54a5f8bc1c71c983b546735fbf76da75ad" exitCode=0 Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.026938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerDied","Data":"8581b7f0946c1fe9f056107f35f0df54a5f8bc1c71c983b546735fbf76da75ad"} Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.191635 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: 
I0219 19:31:15.191679 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qdzk\" (UniqueName: \"kubernetes.io/projected/7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3-kube-api-access-2qdzk\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.293078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.293124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qdzk\" (UniqueName: \"kubernetes.io/projected/7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3-kube-api-access-2qdzk\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.296513 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.296550 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2c18b383742e5c06cff472d15916597dce353e89b4c5777862b9dc0b774bf042/globalmount\"" pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.316227 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qdzk\" (UniqueName: \"kubernetes.io/projected/7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3-kube-api-access-2qdzk\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.317198 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c509f57d-5413-482f-ba7f-0951d0e036e2\") pod \"minio\" (UID: \"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3\") " pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.612688 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Feb 19 19:31:15 crc kubenswrapper[4722]: I0219 19:31:15.897686 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 19 19:31:15 crc kubenswrapper[4722]: W0219 19:31:15.906085 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f22aa66_46c7_4d3c_8a69_4d67e2dcaec3.slice/crio-2bd5083cf187a8313b76fc56920ab9cdacfbb50bf7f83f2ff3bd5f3963137111 WatchSource:0}: Error finding container 2bd5083cf187a8313b76fc56920ab9cdacfbb50bf7f83f2ff3bd5f3963137111: Status 404 returned error can't find the container with id 2bd5083cf187a8313b76fc56920ab9cdacfbb50bf7f83f2ff3bd5f3963137111 Feb 19 19:31:16 crc kubenswrapper[4722]: I0219 19:31:16.032500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3","Type":"ContainerStarted","Data":"2bd5083cf187a8313b76fc56920ab9cdacfbb50bf7f83f2ff3bd5f3963137111"} Feb 19 19:31:17 crc kubenswrapper[4722]: I0219 19:31:17.043959 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerID="33a6f974d138e43146af7cf2af1e131660c57e0192bce7040ef9d5b0386220b3" exitCode=0 Feb 19 19:31:17 crc kubenswrapper[4722]: I0219 19:31:17.044238 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" event={"ID":"4f50f1aa-154d-409a-826d-c6c4b3c75559","Type":"ContainerDied","Data":"33a6f974d138e43146af7cf2af1e131660c57e0192bce7040ef9d5b0386220b3"} Feb 19 19:31:17 crc kubenswrapper[4722]: I0219 19:31:17.046975 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerStarted","Data":"bd98111959a56e8d8638f1af184532697b89ba2a2da7bc8aefdfc81e4f25d7b5"} Feb 19 19:31:17 crc kubenswrapper[4722]: 
I0219 19:31:17.080808 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jss6p" podStartSLOduration=3.161405227 podStartE2EDuration="6.080794147s" podCreationTimestamp="2026-02-19 19:31:11 +0000 UTC" firstStartedPulling="2026-02-19 19:31:12.998533044 +0000 UTC m=+772.610883368" lastFinishedPulling="2026-02-19 19:31:15.917921954 +0000 UTC m=+775.530272288" observedRunningTime="2026-02-19 19:31:17.078708102 +0000 UTC m=+776.691058426" watchObservedRunningTime="2026-02-19 19:31:17.080794147 +0000 UTC m=+776.693144471" Feb 19 19:31:18 crc kubenswrapper[4722]: I0219 19:31:18.054046 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerID="7bd1b3484ee9fb3dff98e5657392c34e57d5cc3cb981bb43d8b31f6f90ed8f2d" exitCode=0 Feb 19 19:31:18 crc kubenswrapper[4722]: I0219 19:31:18.054390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" event={"ID":"4f50f1aa-154d-409a-826d-c6c4b3c75559","Type":"ContainerDied","Data":"7bd1b3484ee9fb3dff98e5657392c34e57d5cc3cb981bb43d8b31f6f90ed8f2d"} Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.060849 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"7f22aa66-46c7-4d3c-8a69-4d67e2dcaec3","Type":"ContainerStarted","Data":"10b59a7d3bcea6edd529a61d6ddb1239519bf3cbfbf06ffef4e16163a5132d83"} Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.083197 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.124218441 podStartE2EDuration="7.083175553s" podCreationTimestamp="2026-02-19 19:31:12 +0000 UTC" firstStartedPulling="2026-02-19 19:31:15.917370237 +0000 UTC m=+775.529720571" lastFinishedPulling="2026-02-19 19:31:18.876327359 +0000 UTC m=+778.488677683" observedRunningTime="2026-02-19 19:31:19.075242226 +0000 UTC 
m=+778.687592550" watchObservedRunningTime="2026-02-19 19:31:19.083175553 +0000 UTC m=+778.695525877" Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.316789 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.449811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle\") pod \"4f50f1aa-154d-409a-826d-c6c4b3c75559\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.450066 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util\") pod \"4f50f1aa-154d-409a-826d-c6c4b3c75559\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.450203 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb2gz\" (UniqueName: \"kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz\") pod \"4f50f1aa-154d-409a-826d-c6c4b3c75559\" (UID: \"4f50f1aa-154d-409a-826d-c6c4b3c75559\") " Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.450975 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle" (OuterVolumeSpecName: "bundle") pod "4f50f1aa-154d-409a-826d-c6c4b3c75559" (UID: "4f50f1aa-154d-409a-826d-c6c4b3c75559"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.459996 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util" (OuterVolumeSpecName: "util") pod "4f50f1aa-154d-409a-826d-c6c4b3c75559" (UID: "4f50f1aa-154d-409a-826d-c6c4b3c75559"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.460463 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz" (OuterVolumeSpecName: "kube-api-access-jb2gz") pod "4f50f1aa-154d-409a-826d-c6c4b3c75559" (UID: "4f50f1aa-154d-409a-826d-c6c4b3c75559"). InnerVolumeSpecName "kube-api-access-jb2gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.551868 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.551905 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f50f1aa-154d-409a-826d-c6c4b3c75559-util\") on node \"crc\" DevicePath \"\"" Feb 19 19:31:19 crc kubenswrapper[4722]: I0219 19:31:19.551917 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb2gz\" (UniqueName: \"kubernetes.io/projected/4f50f1aa-154d-409a-826d-c6c4b3c75559-kube-api-access-jb2gz\") on node \"crc\" DevicePath \"\"" Feb 19 19:31:20 crc kubenswrapper[4722]: I0219 19:31:20.068725 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" 
event={"ID":"4f50f1aa-154d-409a-826d-c6c4b3c75559","Type":"ContainerDied","Data":"e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4"} Feb 19 19:31:20 crc kubenswrapper[4722]: I0219 19:31:20.069103 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e965cd989c871d690e2430e757028ef7e540c2bc2d13dbbcb0ecc2c8d3a24aa4" Feb 19 19:31:20 crc kubenswrapper[4722]: I0219 19:31:20.068757 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2" Feb 19 19:31:21 crc kubenswrapper[4722]: I0219 19:31:21.599763 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:21 crc kubenswrapper[4722]: I0219 19:31:21.599805 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:22 crc kubenswrapper[4722]: I0219 19:31:22.658544 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jss6p" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="registry-server" probeResult="failure" output=< Feb 19 19:31:22 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 19:31:22 crc kubenswrapper[4722]: > Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.881985 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"] Feb 19 19:31:25 crc kubenswrapper[4722]: E0219 19:31:25.882531 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="pull" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.882546 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="pull" Feb 19 19:31:25 crc 
kubenswrapper[4722]: E0219 19:31:25.882568 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="util" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.882575 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="util" Feb 19 19:31:25 crc kubenswrapper[4722]: E0219 19:31:25.882587 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="extract" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.882598 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="extract" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.882707 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f50f1aa-154d-409a-826d-c6c4b3c75559" containerName="extract" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.884348 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.891825 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.891825 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.891907 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.892138 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.902504 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-zpvtt" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.906228 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Feb 19 19:31:25 crc kubenswrapper[4722]: I0219 19:31:25.914580 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"] Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.048133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-manager-config\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 
19:31:26.048262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.048311 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-webhook-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.048344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-apiservice-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.048450 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgsb\" (UniqueName: \"kubernetes.io/projected/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-kube-api-access-psgsb\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.149409 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"manager-config\" (UniqueName: \"kubernetes.io/configmap/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-manager-config\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.149468 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.149497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-webhook-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.149522 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-apiservice-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.149542 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgsb\" (UniqueName: \"kubernetes.io/projected/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-kube-api-access-psgsb\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: 
\"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.150977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-manager-config\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.155784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-apiservice-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.157420 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.157666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-webhook-cert\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.168115 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-psgsb\" (UniqueName: \"kubernetes.io/projected/9a86c9a2-6e06-48f9-b266-1a47a3bb4fda-kube-api-access-psgsb\") pod \"loki-operator-controller-manager-5dddbf65fc-6c7df\" (UID: \"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.216418 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:26 crc kubenswrapper[4722]: I0219 19:31:26.640277 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df"] Feb 19 19:31:27 crc kubenswrapper[4722]: I0219 19:31:27.108806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" event={"ID":"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda","Type":"ContainerStarted","Data":"dfcdd4446bf858ddbcbafc5ef4e299589c41e0a6c5fe10920a0d133701b9d861"} Feb 19 19:31:31 crc kubenswrapper[4722]: I0219 19:31:31.651830 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:31 crc kubenswrapper[4722]: I0219 19:31:31.716896 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:32 crc kubenswrapper[4722]: I0219 19:31:32.138464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" event={"ID":"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda","Type":"ContainerStarted","Data":"defeaf615a303d690e3a8a05bc148bd973ec1e75c2e76fe14f14c0734b426ee8"} Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.041488 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-jss6p"] Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.042767 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jss6p" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="registry-server" containerID="cri-o://bd98111959a56e8d8638f1af184532697b89ba2a2da7bc8aefdfc81e4f25d7b5" gracePeriod=2 Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.219809 4722 generic.go:334] "Generic (PLEG): container finished" podID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerID="bd98111959a56e8d8638f1af184532697b89ba2a2da7bc8aefdfc81e4f25d7b5" exitCode=0 Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.219852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerDied","Data":"bd98111959a56e8d8638f1af184532697b89ba2a2da7bc8aefdfc81e4f25d7b5"} Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.493080 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.689803 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mlmh\" (UniqueName: \"kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh\") pod \"ac91b740-cc99-49ce-bda9-e209dfa22140\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.690980 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities\") pod \"ac91b740-cc99-49ce-bda9-e209dfa22140\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.691088 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content\") pod \"ac91b740-cc99-49ce-bda9-e209dfa22140\" (UID: \"ac91b740-cc99-49ce-bda9-e209dfa22140\") " Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.695913 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh" (OuterVolumeSpecName: "kube-api-access-5mlmh") pod "ac91b740-cc99-49ce-bda9-e209dfa22140" (UID: "ac91b740-cc99-49ce-bda9-e209dfa22140"). InnerVolumeSpecName "kube-api-access-5mlmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.701929 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities" (OuterVolumeSpecName: "utilities") pod "ac91b740-cc99-49ce-bda9-e209dfa22140" (UID: "ac91b740-cc99-49ce-bda9-e209dfa22140"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.791806 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mlmh\" (UniqueName: \"kubernetes.io/projected/ac91b740-cc99-49ce-bda9-e209dfa22140-kube-api-access-5mlmh\") on node \"crc\" DevicePath \"\"" Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.792377 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.809842 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac91b740-cc99-49ce-bda9-e209dfa22140" (UID: "ac91b740-cc99-49ce-bda9-e209dfa22140"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:31:34 crc kubenswrapper[4722]: I0219 19:31:34.896282 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac91b740-cc99-49ce-bda9-e209dfa22140-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:31:35 crc kubenswrapper[4722]: I0219 19:31:35.230745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jss6p" event={"ID":"ac91b740-cc99-49ce-bda9-e209dfa22140","Type":"ContainerDied","Data":"e8f9a17191d1c994b4eccb17d54e99e1c9b07621dd90301a1e81fb55767e45fb"} Feb 19 19:31:35 crc kubenswrapper[4722]: I0219 19:31:35.230803 4722 scope.go:117] "RemoveContainer" containerID="bd98111959a56e8d8638f1af184532697b89ba2a2da7bc8aefdfc81e4f25d7b5" Feb 19 19:31:35 crc kubenswrapper[4722]: I0219 19:31:35.230838 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jss6p" Feb 19 19:31:35 crc kubenswrapper[4722]: I0219 19:31:35.250663 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"] Feb 19 19:31:35 crc kubenswrapper[4722]: I0219 19:31:35.257397 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jss6p"] Feb 19 19:31:37 crc kubenswrapper[4722]: I0219 19:31:37.081823 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" path="/var/lib/kubelet/pods/ac91b740-cc99-49ce-bda9-e209dfa22140/volumes" Feb 19 19:31:37 crc kubenswrapper[4722]: I0219 19:31:37.276293 4722 scope.go:117] "RemoveContainer" containerID="8581b7f0946c1fe9f056107f35f0df54a5f8bc1c71c983b546735fbf76da75ad" Feb 19 19:31:38 crc kubenswrapper[4722]: I0219 19:31:38.073772 4722 scope.go:117] "RemoveContainer" containerID="01d1ca82849ceb3bfa6f7f0cd551b947f9fd07718242afa579460d54fe5bc317" Feb 19 19:31:39 crc kubenswrapper[4722]: I0219 19:31:39.255947 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" event={"ID":"9a86c9a2-6e06-48f9-b266-1a47a3bb4fda","Type":"ContainerStarted","Data":"50cf33f9ef0af83ea86e54b9f79085493210ffbf3ad29f62bb735aa9b1c53483"} Feb 19 19:31:39 crc kubenswrapper[4722]: I0219 19:31:39.256470 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:39 crc kubenswrapper[4722]: I0219 19:31:39.262224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" Feb 19 19:31:39 crc kubenswrapper[4722]: I0219 19:31:39.290206 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators-redhat/loki-operator-controller-manager-5dddbf65fc-6c7df" podStartSLOduration=2.783032519 podStartE2EDuration="14.290184538s" podCreationTimestamp="2026-02-19 19:31:25 +0000 UTC" firstStartedPulling="2026-02-19 19:31:26.64438746 +0000 UTC m=+786.256737794" lastFinishedPulling="2026-02-19 19:31:38.151539489 +0000 UTC m=+797.763889813" observedRunningTime="2026-02-19 19:31:39.289009162 +0000 UTC m=+798.901359496" watchObservedRunningTime="2026-02-19 19:31:39.290184538 +0000 UTC m=+798.902534902" Feb 19 19:31:41 crc kubenswrapper[4722]: I0219 19:31:41.798065 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:31:41 crc kubenswrapper[4722]: I0219 19:31:41.798119 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.967772 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"] Feb 19 19:32:10 crc kubenswrapper[4722]: E0219 19:32:10.969304 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="extract-utilities" Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.969381 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="extract-utilities" Feb 19 19:32:10 crc kubenswrapper[4722]: E0219 19:32:10.969452 4722 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="extract-content" Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.969511 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="extract-content" Feb 19 19:32:10 crc kubenswrapper[4722]: E0219 19:32:10.969563 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="registry-server" Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.969612 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="registry-server" Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.969755 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac91b740-cc99-49ce-bda9-e209dfa22140" containerName="registry-server" Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.970550 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.972690 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.979415 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"] Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.980274 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:10 crc 
kubenswrapper[4722]: I0219 19:32:10.980388 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxpmv\" (UniqueName: \"kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:10 crc kubenswrapper[4722]: I0219 19:32:10.980433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.082431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxpmv\" (UniqueName: \"kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.082784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.082847 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.083393 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.083831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.105864 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxpmv\" (UniqueName: \"kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.288992 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.798078 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.798472 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.798514 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.798920 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.798975 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84" gracePeriod=600 Feb 19 19:32:11 crc kubenswrapper[4722]: I0219 19:32:11.868097 4722 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2"] Feb 19 19:32:11 crc kubenswrapper[4722]: W0219 19:32:11.876565 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e5779bd_c885_4bc1_8f8d_924b571e2851.slice/crio-901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671 WatchSource:0}: Error finding container 901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671: Status 404 returned error can't find the container with id 901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671 Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.492981 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerID="9d50fcdd98a825fa5ed4491ac4173ffc494f21622f9c50f1386e71c38a51761f" exitCode=0 Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.493075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" event={"ID":"9e5779bd-c885-4bc1-8f8d-924b571e2851","Type":"ContainerDied","Data":"9d50fcdd98a825fa5ed4491ac4173ffc494f21622f9c50f1386e71c38a51761f"} Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.493381 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" event={"ID":"9e5779bd-c885-4bc1-8f8d-924b571e2851","Type":"ContainerStarted","Data":"901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671"} Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.495165 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84" exitCode=0 Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.495177 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84"} Feb 19 19:32:12 crc kubenswrapper[4722]: I0219 19:32:12.495215 4722 scope.go:117] "RemoveContainer" containerID="ed4098cbee7574ff3d9c55b78db4cadcd44467488f62dc621d61b36a474cc23c" Feb 19 19:32:13 crc kubenswrapper[4722]: I0219 19:32:13.501362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1"} Feb 19 19:32:14 crc kubenswrapper[4722]: I0219 19:32:14.511037 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerID="07d83f2a8566df6f2cb63f1cee655e08614f8aab331873f14d7ed91c61dc7276" exitCode=0 Feb 19 19:32:14 crc kubenswrapper[4722]: I0219 19:32:14.511190 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" event={"ID":"9e5779bd-c885-4bc1-8f8d-924b571e2851","Type":"ContainerDied","Data":"07d83f2a8566df6f2cb63f1cee655e08614f8aab331873f14d7ed91c61dc7276"} Feb 19 19:32:15 crc kubenswrapper[4722]: I0219 19:32:15.522143 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerID="0a1b4d448647d75e417bb8f0254c921c4479419fa3511312e2dc3f1bd4121724" exitCode=0 Feb 19 19:32:15 crc kubenswrapper[4722]: I0219 19:32:15.522233 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" event={"ID":"9e5779bd-c885-4bc1-8f8d-924b571e2851","Type":"ContainerDied","Data":"0a1b4d448647d75e417bb8f0254c921c4479419fa3511312e2dc3f1bd4121724"} Feb 19 19:32:16 crc 
kubenswrapper[4722]: I0219 19:32:16.833464 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.974423 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle\") pod \"9e5779bd-c885-4bc1-8f8d-924b571e2851\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.974517 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util\") pod \"9e5779bd-c885-4bc1-8f8d-924b571e2851\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.974601 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxpmv\" (UniqueName: \"kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv\") pod \"9e5779bd-c885-4bc1-8f8d-924b571e2851\" (UID: \"9e5779bd-c885-4bc1-8f8d-924b571e2851\") " Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.975103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle" (OuterVolumeSpecName: "bundle") pod "9e5779bd-c885-4bc1-8f8d-924b571e2851" (UID: "9e5779bd-c885-4bc1-8f8d-924b571e2851"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.978926 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv" (OuterVolumeSpecName: "kube-api-access-dxpmv") pod "9e5779bd-c885-4bc1-8f8d-924b571e2851" (UID: "9e5779bd-c885-4bc1-8f8d-924b571e2851"). InnerVolumeSpecName "kube-api-access-dxpmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:32:16 crc kubenswrapper[4722]: I0219 19:32:16.994412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util" (OuterVolumeSpecName: "util") pod "9e5779bd-c885-4bc1-8f8d-924b571e2851" (UID: "9e5779bd-c885-4bc1-8f8d-924b571e2851"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.075388 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.075418 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e5779bd-c885-4bc1-8f8d-924b571e2851-util\") on node \"crc\" DevicePath \"\"" Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.075429 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxpmv\" (UniqueName: \"kubernetes.io/projected/9e5779bd-c885-4bc1-8f8d-924b571e2851-kube-api-access-dxpmv\") on node \"crc\" DevicePath \"\"" Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.538802 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" 
event={"ID":"9e5779bd-c885-4bc1-8f8d-924b571e2851","Type":"ContainerDied","Data":"901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671"} Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.538847 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="901462bd7690422c0cd26bff53eb37c9f8843a696320c7c7e73d2d7060ce3671" Feb 19 19:32:17 crc kubenswrapper[4722]: I0219 19:32:17.538850 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.779617 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-hclph"] Feb 19 19:32:20 crc kubenswrapper[4722]: E0219 19:32:20.780126 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="pull" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.780139 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="pull" Feb 19 19:32:20 crc kubenswrapper[4722]: E0219 19:32:20.780187 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="util" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.780194 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="util" Feb 19 19:32:20 crc kubenswrapper[4722]: E0219 19:32:20.780205 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="extract" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.780212 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="extract" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.780301 4722 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9e5779bd-c885-4bc1-8f8d-924b571e2851" containerName="extract" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.780695 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.783608 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.783843 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.784189 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jbjc5" Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.796459 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-hclph"] Feb 19 19:32:20 crc kubenswrapper[4722]: I0219 19:32:20.925638 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttjg8\" (UniqueName: \"kubernetes.io/projected/296e010f-202c-4c01-836e-be6c48607e5f-kube-api-access-ttjg8\") pod \"nmstate-operator-694c9596b7-hclph\" (UID: \"296e010f-202c-4c01-836e-be6c48607e5f\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.027237 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttjg8\" (UniqueName: \"kubernetes.io/projected/296e010f-202c-4c01-836e-be6c48607e5f-kube-api-access-ttjg8\") pod \"nmstate-operator-694c9596b7-hclph\" (UID: \"296e010f-202c-4c01-836e-be6c48607e5f\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.046434 4722 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.056430 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.073928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttjg8\" (UniqueName: \"kubernetes.io/projected/296e010f-202c-4c01-836e-be6c48607e5f-kube-api-access-ttjg8\") pod \"nmstate-operator-694c9596b7-hclph\" (UID: \"296e010f-202c-4c01-836e-be6c48607e5f\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.105508 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-jbjc5" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.114783 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.371805 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-hclph"] Feb 19 19:32:21 crc kubenswrapper[4722]: I0219 19:32:21.565900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" event={"ID":"296e010f-202c-4c01-836e-be6c48607e5f","Type":"ContainerStarted","Data":"d5243cccd00de87c4124189f9a339f018587a00e0d850dfad89108f7a6b4ecf3"} Feb 19 19:32:24 crc kubenswrapper[4722]: I0219 19:32:24.587191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" event={"ID":"296e010f-202c-4c01-836e-be6c48607e5f","Type":"ContainerStarted","Data":"7bfc2b5f20d83c9f10af90bd43fbf2a133f2fb3f71994ad7bee77b6c7296bb51"} Feb 19 19:32:24 crc kubenswrapper[4722]: I0219 19:32:24.618002 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-694c9596b7-hclph" podStartSLOduration=2.557205976 podStartE2EDuration="4.617976198s" podCreationTimestamp="2026-02-19 19:32:20 +0000 UTC" firstStartedPulling="2026-02-19 19:32:21.382650739 +0000 UTC m=+840.995001063" lastFinishedPulling="2026-02-19 19:32:23.443420951 +0000 UTC m=+843.055771285" observedRunningTime="2026-02-19 19:32:24.610566776 +0000 UTC m=+844.222917140" watchObservedRunningTime="2026-02-19 19:32:24.617976198 +0000 UTC m=+844.230326552" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.656279 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.657407 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.659737 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wld46" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.662290 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.662965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.667259 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.671093 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-tvslw"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.673487 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.680002 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.701676 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.782911 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.783613 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.786426 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.786716 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-lkr6n" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787664 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-ovs-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787704 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-dbus-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 
19:32:25.787748 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjd8p\" (UniqueName: \"kubernetes.io/projected/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-kube-api-access-gjd8p\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-nmstate-lock\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787804 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2zlp\" (UniqueName: \"kubernetes.io/projected/f9185385-162a-40a7-9563-3c668080b9e9-kube-api-access-h2zlp\") pod \"nmstate-webhook-866bcb46dc-9jmpv\" (UID: \"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6nsl\" (UniqueName: \"kubernetes.io/projected/62ed738c-2401-4b21-b6a8-1bc2c1c009ae-kube-api-access-b6nsl\") pod \"nmstate-metrics-58c85c668d-t5lsr\" (UID: \"62ed738c-2401-4b21-b6a8-1bc2c1c009ae\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f9185385-162a-40a7-9563-3c668080b9e9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9jmpv\" (UID: 
\"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.787998 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.811460 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-nmstate-lock\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889730 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2zlp\" (UniqueName: \"kubernetes.io/projected/f9185385-162a-40a7-9563-3c668080b9e9-kube-api-access-h2zlp\") pod \"nmstate-webhook-866bcb46dc-9jmpv\" (UID: \"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6nsl\" (UniqueName: \"kubernetes.io/projected/62ed738c-2401-4b21-b6a8-1bc2c1c009ae-kube-api-access-b6nsl\") pod 
\"nmstate-metrics-58c85c668d-t5lsr\" (UID: \"62ed738c-2401-4b21-b6a8-1bc2c1c009ae\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc4g6\" (UniqueName: \"kubernetes.io/projected/ed131fa7-525a-481d-83a9-4fef817dc7ce-kube-api-access-xc4g6\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f9185385-162a-40a7-9563-3c668080b9e9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-9jmpv\" (UID: \"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889859 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-ovs-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889885 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-dbus-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889932 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/ed131fa7-525a-481d-83a9-4fef817dc7ce-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.889962 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjd8p\" (UniqueName: \"kubernetes.io/projected/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-kube-api-access-gjd8p\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.890351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-nmstate-lock\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.891433 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-dbus-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.891870 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-ovs-socket\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.896122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f9185385-162a-40a7-9563-3c668080b9e9-tls-key-pair\") pod 
\"nmstate-webhook-866bcb46dc-9jmpv\" (UID: \"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.906393 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2zlp\" (UniqueName: \"kubernetes.io/projected/f9185385-162a-40a7-9563-3c668080b9e9-kube-api-access-h2zlp\") pod \"nmstate-webhook-866bcb46dc-9jmpv\" (UID: \"f9185385-162a-40a7-9563-3c668080b9e9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.906609 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjd8p\" (UniqueName: \"kubernetes.io/projected/59139bb2-e1ae-4f74-96fe-6ea34d232cd9-kube-api-access-gjd8p\") pod \"nmstate-handler-tvslw\" (UID: \"59139bb2-e1ae-4f74-96fe-6ea34d232cd9\") " pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.909175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6nsl\" (UniqueName: \"kubernetes.io/projected/62ed738c-2401-4b21-b6a8-1bc2c1c009ae-kube-api-access-b6nsl\") pod \"nmstate-metrics-58c85c668d-t5lsr\" (UID: \"62ed738c-2401-4b21-b6a8-1bc2c1c009ae\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.986576 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.989317 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fb979c56d-ddvr6"] Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.990200 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.990777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ed131fa7-525a-481d-83a9-4fef817dc7ce-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.990838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.990898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc4g6\" (UniqueName: \"kubernetes.io/projected/ed131fa7-525a-481d-83a9-4fef817dc7ce-kube-api-access-xc4g6\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:25 crc kubenswrapper[4722]: E0219 19:32:25.991063 4722 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 19 19:32:25 crc kubenswrapper[4722]: E0219 19:32:25.991181 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert podName:ed131fa7-525a-481d-83a9-4fef817dc7ce nodeName:}" failed. No retries permitted until 2026-02-19 19:32:26.491116217 +0000 UTC m=+846.103466541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-nlx9v" (UID: "ed131fa7-525a-481d-83a9-4fef817dc7ce") : secret "plugin-serving-cert" not found Feb 19 19:32:25 crc kubenswrapper[4722]: I0219 19:32:25.992074 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ed131fa7-525a-481d-83a9-4fef817dc7ce-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.006436 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb979c56d-ddvr6"] Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.037930 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.040214 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.047829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc4g6\" (UniqueName: \"kubernetes.io/projected/ed131fa7-525a-481d-83a9-4fef817dc7ce-kube-api-access-xc4g6\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.091950 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092306 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-oauth-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092342 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-service-ca\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092372 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-oauth-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092417 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-console-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092437 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-trusted-ca-bundle\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.092465 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km8sf\" (UniqueName: \"kubernetes.io/projected/defd195c-f260-424a-8740-be368c4d8e64-kube-api-access-km8sf\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194066 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-oauth-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-service-ca\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194345 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-oauth-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194456 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-trusted-ca-bundle\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194511 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-console-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.194580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km8sf\" 
(UniqueName: \"kubernetes.io/projected/defd195c-f260-424a-8740-be368c4d8e64-kube-api-access-km8sf\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.197281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-service-ca\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.197429 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-oauth-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.199415 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-console-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.199741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-oauth-config\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.199976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/defd195c-f260-424a-8740-be368c4d8e64-console-serving-cert\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.200456 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/defd195c-f260-424a-8740-be368c4d8e64-trusted-ca-bundle\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.216086 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km8sf\" (UniqueName: \"kubernetes.io/projected/defd195c-f260-424a-8740-be368c4d8e64-kube-api-access-km8sf\") pod \"console-6fb979c56d-ddvr6\" (UID: \"defd195c-f260-424a-8740-be368c4d8e64\") " pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: W0219 19:32:26.269759 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9185385_162a_40a7_9563_3c668080b9e9.slice/crio-7655749a940b6859aaa6f30fbc6151897fe34779aefb85134ffc5d3b55c3228d WatchSource:0}: Error finding container 7655749a940b6859aaa6f30fbc6151897fe34779aefb85134ffc5d3b55c3228d: Status 404 returned error can't find the container with id 7655749a940b6859aaa6f30fbc6151897fe34779aefb85134ffc5d3b55c3228d Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.270079 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv"] Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.341693 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.415091 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr"] Feb 19 19:32:26 crc kubenswrapper[4722]: W0219 19:32:26.425402 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62ed738c_2401_4b21_b6a8_1bc2c1c009ae.slice/crio-accd5b739630560da86808c657c07168e2997e6b9ca379582229f5123d466c28 WatchSource:0}: Error finding container accd5b739630560da86808c657c07168e2997e6b9ca379582229f5123d466c28: Status 404 returned error can't find the container with id accd5b739630560da86808c657c07168e2997e6b9ca379582229f5123d466c28 Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.498402 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.504839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed131fa7-525a-481d-83a9-4fef817dc7ce-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-nlx9v\" (UID: \"ed131fa7-525a-481d-83a9-4fef817dc7ce\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.580301 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb979c56d-ddvr6"] Feb 19 19:32:26 crc kubenswrapper[4722]: W0219 19:32:26.585281 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddefd195c_f260_424a_8740_be368c4d8e64.slice/crio-ae0afe68dea7cfbff4f66af842f270c41a183e47a331165eabf5af8c30310974 WatchSource:0}: Error finding container ae0afe68dea7cfbff4f66af842f270c41a183e47a331165eabf5af8c30310974: Status 404 returned error can't find the container with id ae0afe68dea7cfbff4f66af842f270c41a183e47a331165eabf5af8c30310974 Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.597688 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" event={"ID":"f9185385-162a-40a7-9563-3c668080b9e9","Type":"ContainerStarted","Data":"7655749a940b6859aaa6f30fbc6151897fe34779aefb85134ffc5d3b55c3228d"} Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.599536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" event={"ID":"62ed738c-2401-4b21-b6a8-1bc2c1c009ae","Type":"ContainerStarted","Data":"accd5b739630560da86808c657c07168e2997e6b9ca379582229f5123d466c28"} Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.600476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb979c56d-ddvr6" event={"ID":"defd195c-f260-424a-8740-be368c4d8e64","Type":"ContainerStarted","Data":"ae0afe68dea7cfbff4f66af842f270c41a183e47a331165eabf5af8c30310974"} Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.601673 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tvslw" event={"ID":"59139bb2-e1ae-4f74-96fe-6ea34d232cd9","Type":"ContainerStarted","Data":"eb68fa767886eb1ca3412bb8d8c2dcb200b27b72c65526020742dad4f4733791"} Feb 19 19:32:26 crc kubenswrapper[4722]: I0219 19:32:26.708515 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" Feb 19 19:32:27 crc kubenswrapper[4722]: I0219 19:32:27.166978 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v"] Feb 19 19:32:27 crc kubenswrapper[4722]: W0219 19:32:27.174893 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded131fa7_525a_481d_83a9_4fef817dc7ce.slice/crio-251c11e1a02120e61c4fcf6e6248dbcb63df2a578e2431da8724ecdd32c53eb7 WatchSource:0}: Error finding container 251c11e1a02120e61c4fcf6e6248dbcb63df2a578e2431da8724ecdd32c53eb7: Status 404 returned error can't find the container with id 251c11e1a02120e61c4fcf6e6248dbcb63df2a578e2431da8724ecdd32c53eb7 Feb 19 19:32:27 crc kubenswrapper[4722]: I0219 19:32:27.608212 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" event={"ID":"ed131fa7-525a-481d-83a9-4fef817dc7ce","Type":"ContainerStarted","Data":"251c11e1a02120e61c4fcf6e6248dbcb63df2a578e2431da8724ecdd32c53eb7"} Feb 19 19:32:27 crc kubenswrapper[4722]: I0219 19:32:27.610036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb979c56d-ddvr6" event={"ID":"defd195c-f260-424a-8740-be368c4d8e64","Type":"ContainerStarted","Data":"17d85fd0f3f57c1dfcdfebd6b1039b0749f82847f1f7ca46cf88cb343f6fc399"} Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.623249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" event={"ID":"f9185385-162a-40a7-9563-3c668080b9e9","Type":"ContainerStarted","Data":"b0f581c2a48df492c939cb59b8a40f93b0c5768dd79cbf576b2b46af03eaf9d7"} Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.624681 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:29 crc 
kubenswrapper[4722]: I0219 19:32:29.626498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" event={"ID":"62ed738c-2401-4b21-b6a8-1bc2c1c009ae","Type":"ContainerStarted","Data":"40b54ede01293a2bee2ff3af26cb470b08b4fa287df75102887c47ce592bc0c7"} Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.628120 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tvslw" event={"ID":"59139bb2-e1ae-4f74-96fe-6ea34d232cd9","Type":"ContainerStarted","Data":"cbd9cab1a56cecca803ffe0d28cfd983624c3d43e32dce7df5eb02641fc9a290"} Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.628585 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.644145 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fb979c56d-ddvr6" podStartSLOduration=4.644126737 podStartE2EDuration="4.644126737s" podCreationTimestamp="2026-02-19 19:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:32:27.641885135 +0000 UTC m=+847.254235479" watchObservedRunningTime="2026-02-19 19:32:29.644126737 +0000 UTC m=+849.256477051" Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.649649 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" podStartSLOduration=2.240042429 podStartE2EDuration="4.649631219s" podCreationTimestamp="2026-02-19 19:32:25 +0000 UTC" firstStartedPulling="2026-02-19 19:32:26.271927301 +0000 UTC m=+845.884277625" lastFinishedPulling="2026-02-19 19:32:28.681516091 +0000 UTC m=+848.293866415" observedRunningTime="2026-02-19 19:32:29.639734951 +0000 UTC m=+849.252085285" watchObservedRunningTime="2026-02-19 19:32:29.649631219 +0000 UTC 
m=+849.261981543" Feb 19 19:32:29 crc kubenswrapper[4722]: I0219 19:32:29.664614 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-tvslw" podStartSLOduration=2.08516175 podStartE2EDuration="4.664516333s" podCreationTimestamp="2026-02-19 19:32:25 +0000 UTC" firstStartedPulling="2026-02-19 19:32:26.100736984 +0000 UTC m=+845.713087298" lastFinishedPulling="2026-02-19 19:32:28.680091557 +0000 UTC m=+848.292441881" observedRunningTime="2026-02-19 19:32:29.657514275 +0000 UTC m=+849.269864599" watchObservedRunningTime="2026-02-19 19:32:29.664516333 +0000 UTC m=+849.276866657" Feb 19 19:32:30 crc kubenswrapper[4722]: I0219 19:32:30.640619 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" event={"ID":"ed131fa7-525a-481d-83a9-4fef817dc7ce","Type":"ContainerStarted","Data":"905b96133533a1f6cf98023a1c82527ee542932bd35c2639d5ebc67e4f1d586b"} Feb 19 19:32:30 crc kubenswrapper[4722]: I0219 19:32:30.665483 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-nlx9v" podStartSLOduration=3.0373082 podStartE2EDuration="5.665457054s" podCreationTimestamp="2026-02-19 19:32:25 +0000 UTC" firstStartedPulling="2026-02-19 19:32:27.178009965 +0000 UTC m=+846.790360289" lastFinishedPulling="2026-02-19 19:32:29.806158819 +0000 UTC m=+849.418509143" observedRunningTime="2026-02-19 19:32:30.662177572 +0000 UTC m=+850.274527906" watchObservedRunningTime="2026-02-19 19:32:30.665457054 +0000 UTC m=+850.277807398" Feb 19 19:32:32 crc kubenswrapper[4722]: I0219 19:32:32.655660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" event={"ID":"62ed738c-2401-4b21-b6a8-1bc2c1c009ae","Type":"ContainerStarted","Data":"1eb4b5b440eaadd46c6dc6572c1b7199002aa8c187d6866815521a60b4fe01c9"} Feb 19 19:32:32 crc kubenswrapper[4722]: I0219 
19:32:32.697003 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-t5lsr" podStartSLOduration=2.338724354 podStartE2EDuration="7.68351872s" podCreationTimestamp="2026-02-19 19:32:25 +0000 UTC" firstStartedPulling="2026-02-19 19:32:26.428106619 +0000 UTC m=+846.040456993" lastFinishedPulling="2026-02-19 19:32:31.772901025 +0000 UTC m=+851.385251359" observedRunningTime="2026-02-19 19:32:32.67999752 +0000 UTC m=+852.292347914" watchObservedRunningTime="2026-02-19 19:32:32.68351872 +0000 UTC m=+852.295869074" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.068642 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-tvslw" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.342540 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.342621 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.347892 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.690659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fb979c56d-ddvr6" Feb 19 19:32:36 crc kubenswrapper[4722]: I0219 19:32:36.770844 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:32:46 crc kubenswrapper[4722]: I0219 19:32:46.048960 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-9jmpv" Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.843679 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.848585 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.873225 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.986659 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.986719 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:57 crc kubenswrapper[4722]: I0219 19:32:57.986797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65pq\" (UniqueName: \"kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.087953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65pq\" (UniqueName: \"kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq\") pod \"community-operators-5t95p\" (UID: 
\"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.088320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.088371 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.089168 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.089220 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.112055 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65pq\" (UniqueName: \"kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq\") pod \"community-operators-5t95p\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " 
pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.169631 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.686878 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.849857 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e454faa-cee8-4571-88ed-88bb048abe32" containerID="c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2" exitCode=0 Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.849908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerDied","Data":"c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2"} Feb 19 19:32:58 crc kubenswrapper[4722]: I0219 19:32:58.849958 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerStarted","Data":"139202899fc571b300add407645fb64230d610b9266e7028671d6d1cc0159fda"} Feb 19 19:32:59 crc kubenswrapper[4722]: I0219 19:32:59.857258 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerStarted","Data":"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191"} Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.252432 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb"] Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.253682 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.255598 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.260909 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb"] Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.427516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.427597 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.427668 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj62k\" (UniqueName: \"kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: 
I0219 19:33:00.528294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.528337 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.528368 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj62k\" (UniqueName: \"kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.528723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.528979 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.547347 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj62k\" (UniqueName: \"kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.569856 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.865376 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e454faa-cee8-4571-88ed-88bb048abe32" containerID="a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191" exitCode=0 Feb 19 19:33:00 crc kubenswrapper[4722]: I0219 19:33:00.865439 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerDied","Data":"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191"} Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.008687 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb"] Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.826518 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-txlzt" 
podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" containerID="cri-o://25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af" gracePeriod=15 Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.873576 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerStarted","Data":"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b"} Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.876011 4722 generic.go:334] "Generic (PLEG): container finished" podID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerID="33c31fb6cece00c6ce8a25f27e4fbb1f073a2b8beaacb0aa68fbf04528f42ba4" exitCode=0 Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.876053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" event={"ID":"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72","Type":"ContainerDied","Data":"33c31fb6cece00c6ce8a25f27e4fbb1f073a2b8beaacb0aa68fbf04528f42ba4"} Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.876078 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" event={"ID":"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72","Type":"ContainerStarted","Data":"e509223e66336b2f8aea7b9aeca56637052efbf0782b27bea70243047cae786c"} Feb 19 19:33:01 crc kubenswrapper[4722]: I0219 19:33:01.891900 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5t95p" podStartSLOduration=2.439929147 podStartE2EDuration="4.89188119s" podCreationTimestamp="2026-02-19 19:32:57 +0000 UTC" firstStartedPulling="2026-02-19 19:32:58.852032992 +0000 UTC m=+878.464383316" lastFinishedPulling="2026-02-19 19:33:01.303985035 +0000 UTC m=+880.916335359" observedRunningTime="2026-02-19 19:33:01.890465646 +0000 
UTC m=+881.502816000" watchObservedRunningTime="2026-02-19 19:33:01.89188119 +0000 UTC m=+881.504231524" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.228819 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-txlzt_187676b8-1029-4153-9da5-6614e9b7892e/console/0.log" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.229110 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352649 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqdj9\" (UniqueName: \"kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352706 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 
19:33:02.352844 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.352912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert\") pod \"187676b8-1029-4153-9da5-6614e9b7892e\" (UID: \"187676b8-1029-4153-9da5-6614e9b7892e\") " Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.353691 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.353706 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca" (OuterVolumeSpecName: "service-ca") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.353735 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.353748 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config" (OuterVolumeSpecName: "console-config") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.358678 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.358898 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9" (OuterVolumeSpecName: "kube-api-access-hqdj9") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "kube-api-access-hqdj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.358974 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "187676b8-1029-4153-9da5-6614e9b7892e" (UID: "187676b8-1029-4153-9da5-6614e9b7892e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454183 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqdj9\" (UniqueName: \"kubernetes.io/projected/187676b8-1029-4153-9da5-6614e9b7892e-kube-api-access-hqdj9\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454221 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454232 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454242 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454252 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/187676b8-1029-4153-9da5-6614e9b7892e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454261 4722 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.454273 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/187676b8-1029-4153-9da5-6614e9b7892e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882204 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-txlzt_187676b8-1029-4153-9da5-6614e9b7892e/console/0.log" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882246 4722 generic.go:334] "Generic (PLEG): container finished" podID="187676b8-1029-4153-9da5-6614e9b7892e" containerID="25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af" exitCode=2 Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txlzt" event={"ID":"187676b8-1029-4153-9da5-6614e9b7892e","Type":"ContainerDied","Data":"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af"} Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882329 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-txlzt" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882375 4722 scope.go:117] "RemoveContainer" containerID="25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.882364 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-txlzt" event={"ID":"187676b8-1029-4153-9da5-6614e9b7892e","Type":"ContainerDied","Data":"fdc6fea50eb108128f2352057c1c724769c297d884371189de3a59a1b99e73b3"} Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.907366 4722 scope.go:117] "RemoveContainer" containerID="25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af" Feb 19 19:33:02 crc kubenswrapper[4722]: E0219 19:33:02.907930 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af\": container with ID starting with 25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af not found: ID does not exist" containerID="25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.907999 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af"} err="failed to get container status \"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af\": rpc error: code = NotFound desc = could not find container \"25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af\": container with ID starting with 25a3ecefb9646039c15ec8ff24abdab1b2930b60c238ae51c0241920a2fc33af not found: ID does not exist" Feb 19 19:33:02 crc kubenswrapper[4722]: I0219 19:33:02.924795 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:33:02 crc 
kubenswrapper[4722]: I0219 19:33:02.936241 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-txlzt"] Feb 19 19:33:03 crc kubenswrapper[4722]: I0219 19:33:03.081100 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187676b8-1029-4153-9da5-6614e9b7892e" path="/var/lib/kubelet/pods/187676b8-1029-4153-9da5-6614e9b7892e/volumes" Feb 19 19:33:03 crc kubenswrapper[4722]: I0219 19:33:03.895048 4722 generic.go:334] "Generic (PLEG): container finished" podID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerID="35b389162c760bf1c3e490310072541d2133d47a3ddc8b348105d9db972ad459" exitCode=0 Feb 19 19:33:03 crc kubenswrapper[4722]: I0219 19:33:03.895243 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" event={"ID":"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72","Type":"ContainerDied","Data":"35b389162c760bf1c3e490310072541d2133d47a3ddc8b348105d9db972ad459"} Feb 19 19:33:04 crc kubenswrapper[4722]: I0219 19:33:04.911410 4722 generic.go:334] "Generic (PLEG): container finished" podID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerID="fbba620dd77635b9ebc6fb1d20fb41377dd7eda8493d11165e9dd8721dd04bc9" exitCode=0 Feb 19 19:33:04 crc kubenswrapper[4722]: I0219 19:33:04.911484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" event={"ID":"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72","Type":"ContainerDied","Data":"fbba620dd77635b9ebc6fb1d20fb41377dd7eda8493d11165e9dd8721dd04bc9"} Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.188269 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.301361 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj62k\" (UniqueName: \"kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k\") pod \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.301403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util\") pod \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.301465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle\") pod \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\" (UID: \"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72\") " Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.302636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle" (OuterVolumeSpecName: "bundle") pod "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" (UID: "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.309307 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k" (OuterVolumeSpecName: "kube-api-access-pj62k") pod "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" (UID: "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72"). InnerVolumeSpecName "kube-api-access-pj62k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.402349 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj62k\" (UniqueName: \"kubernetes.io/projected/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-kube-api-access-pj62k\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.402384 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.657454 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util" (OuterVolumeSpecName: "util") pod "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" (UID: "23190f3b-c7a4-4368-ab62-9d5cbd8ddf72"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.706346 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/23190f3b-c7a4-4368-ab62-9d5cbd8ddf72-util\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.929471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" event={"ID":"23190f3b-c7a4-4368-ab62-9d5cbd8ddf72","Type":"ContainerDied","Data":"e509223e66336b2f8aea7b9aeca56637052efbf0782b27bea70243047cae786c"} Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.929538 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb" Feb 19 19:33:06 crc kubenswrapper[4722]: I0219 19:33:06.929542 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e509223e66336b2f8aea7b9aeca56637052efbf0782b27bea70243047cae786c" Feb 19 19:33:08 crc kubenswrapper[4722]: I0219 19:33:08.170920 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:08 crc kubenswrapper[4722]: I0219 19:33:08.170972 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:08 crc kubenswrapper[4722]: I0219 19:33:08.213054 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:09 crc kubenswrapper[4722]: I0219 19:33:09.019073 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:10 crc kubenswrapper[4722]: I0219 19:33:10.217110 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:33:10 crc kubenswrapper[4722]: I0219 19:33:10.955061 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5t95p" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="registry-server" containerID="cri-o://78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b" gracePeriod=2 Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.339118 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.378754 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65pq\" (UniqueName: \"kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq\") pod \"9e454faa-cee8-4571-88ed-88bb048abe32\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.378962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities\") pod \"9e454faa-cee8-4571-88ed-88bb048abe32\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.383422 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq" (OuterVolumeSpecName: "kube-api-access-c65pq") pod "9e454faa-cee8-4571-88ed-88bb048abe32" (UID: "9e454faa-cee8-4571-88ed-88bb048abe32"). InnerVolumeSpecName "kube-api-access-c65pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.390021 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities" (OuterVolumeSpecName: "utilities") pod "9e454faa-cee8-4571-88ed-88bb048abe32" (UID: "9e454faa-cee8-4571-88ed-88bb048abe32"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.480428 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content\") pod \"9e454faa-cee8-4571-88ed-88bb048abe32\" (UID: \"9e454faa-cee8-4571-88ed-88bb048abe32\") " Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.480868 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c65pq\" (UniqueName: \"kubernetes.io/projected/9e454faa-cee8-4571-88ed-88bb048abe32-kube-api-access-c65pq\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.480901 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.530239 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e454faa-cee8-4571-88ed-88bb048abe32" (UID: "9e454faa-cee8-4571-88ed-88bb048abe32"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.582187 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e454faa-cee8-4571-88ed-88bb048abe32-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.963420 4722 generic.go:334] "Generic (PLEG): container finished" podID="9e454faa-cee8-4571-88ed-88bb048abe32" containerID="78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b" exitCode=0 Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.963478 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerDied","Data":"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b"} Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.963515 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5t95p" event={"ID":"9e454faa-cee8-4571-88ed-88bb048abe32","Type":"ContainerDied","Data":"139202899fc571b300add407645fb64230d610b9266e7028671d6d1cc0159fda"} Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.963541 4722 scope.go:117] "RemoveContainer" containerID="78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b" Feb 19 19:33:11 crc kubenswrapper[4722]: I0219 19:33:11.963705 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5t95p" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.005825 4722 scope.go:117] "RemoveContainer" containerID="a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.019111 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.024892 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5t95p"] Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.029931 4722 scope.go:117] "RemoveContainer" containerID="c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.052130 4722 scope.go:117] "RemoveContainer" containerID="78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b" Feb 19 19:33:12 crc kubenswrapper[4722]: E0219 19:33:12.052602 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b\": container with ID starting with 78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b not found: ID does not exist" containerID="78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.052642 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b"} err="failed to get container status \"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b\": rpc error: code = NotFound desc = could not find container \"78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b\": container with ID starting with 78e840fe74e6d5cb75f08aafdbcbfd2d86d750de67acd49e6941b2897b12d55b not 
found: ID does not exist" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.052665 4722 scope.go:117] "RemoveContainer" containerID="a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191" Feb 19 19:33:12 crc kubenswrapper[4722]: E0219 19:33:12.052982 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191\": container with ID starting with a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191 not found: ID does not exist" containerID="a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.053002 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191"} err="failed to get container status \"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191\": rpc error: code = NotFound desc = could not find container \"a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191\": container with ID starting with a2344170cb31e8f83443def974a8dee539ffe0a2d70574bdfa92361702df7191 not found: ID does not exist" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.053015 4722 scope.go:117] "RemoveContainer" containerID="c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2" Feb 19 19:33:12 crc kubenswrapper[4722]: E0219 19:33:12.053309 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2\": container with ID starting with c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2 not found: ID does not exist" containerID="c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2" Feb 19 19:33:12 crc kubenswrapper[4722]: I0219 19:33:12.053328 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2"} err="failed to get container status \"c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2\": rpc error: code = NotFound desc = could not find container \"c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2\": container with ID starting with c81fc97344196c754ca658b1b2d392c9a69975da412c0dce6303fa7ae41a91c2 not found: ID does not exist" Feb 19 19:33:13 crc kubenswrapper[4722]: I0219 19:33:13.077904 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" path="/var/lib/kubelet/pods/9e454faa-cee8-4571-88ed-88bb048abe32/volumes" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386570 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx"] Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386855 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="extract-content" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386873 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="extract-content" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386891 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="pull" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386899 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="pull" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386910 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386918 
4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386934 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="registry-server" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386943 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="registry-server" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386961 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="extract-utilities" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386970 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="extract-utilities" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.386984 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="util" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.386991 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="util" Feb 19 19:33:14 crc kubenswrapper[4722]: E0219 19:33:14.387002 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="extract" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.387010 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="extract" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.387130 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="23190f3b-c7a4-4368-ab62-9d5cbd8ddf72" containerName="extract" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.387145 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9e454faa-cee8-4571-88ed-88bb048abe32" containerName="registry-server" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.387182 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="187676b8-1029-4153-9da5-6614e9b7892e" containerName="console" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.387690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.390104 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.390757 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.392791 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-ntbjt" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.393315 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.399983 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.433421 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx"] Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.518925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-webhook-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " 
pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.518996 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr4kk\" (UniqueName: \"kubernetes.io/projected/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-kube-api-access-lr4kk\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.519087 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-apiservice-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.619920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr4kk\" (UniqueName: \"kubernetes.io/projected/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-kube-api-access-lr4kk\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.620259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-apiservice-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.620384 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-webhook-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.626251 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-webhook-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.626266 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-apiservice-cert\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.642921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr4kk\" (UniqueName: \"kubernetes.io/projected/f41ca32e-24fc-427a-a2bc-76e4d5abba0f-kube-api-access-lr4kk\") pod \"metallb-operator-controller-manager-84788dc4db-d5shx\" (UID: \"f41ca32e-24fc-427a-a2bc-76e4d5abba0f\") " pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.702769 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.723210 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2"] Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.724069 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.725755 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.725930 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.727141 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9rxqc" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.746833 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2"] Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.829844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s7c6\" (UniqueName: \"kubernetes.io/projected/02eda63c-5131-407e-bb2e-7ad0adf0e985-kube-api-access-4s7c6\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.829922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-webhook-cert\") pod 
\"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.829960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-apiservice-cert\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.931078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-webhook-cert\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.931457 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-apiservice-cert\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.931528 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s7c6\" (UniqueName: \"kubernetes.io/projected/02eda63c-5131-407e-bb2e-7ad0adf0e985-kube-api-access-4s7c6\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.936675 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-webhook-cert\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.936686 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02eda63c-5131-407e-bb2e-7ad0adf0e985-apiservice-cert\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:14 crc kubenswrapper[4722]: I0219 19:33:14.949061 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s7c6\" (UniqueName: \"kubernetes.io/projected/02eda63c-5131-407e-bb2e-7ad0adf0e985-kube-api-access-4s7c6\") pod \"metallb-operator-webhook-server-78b8d96b76-5d9t2\" (UID: \"02eda63c-5131-407e-bb2e-7ad0adf0e985\") " pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:15 crc kubenswrapper[4722]: I0219 19:33:15.080126 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:15 crc kubenswrapper[4722]: I0219 19:33:15.176783 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx"] Feb 19 19:33:15 crc kubenswrapper[4722]: I0219 19:33:15.487396 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2"] Feb 19 19:33:15 crc kubenswrapper[4722]: W0219 19:33:15.492946 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02eda63c_5131_407e_bb2e_7ad0adf0e985.slice/crio-6188245bed74d3a633d6e7b4b0d1a1e71db2bedd990d704e3fe1784f144b2661 WatchSource:0}: Error finding container 6188245bed74d3a633d6e7b4b0d1a1e71db2bedd990d704e3fe1784f144b2661: Status 404 returned error can't find the container with id 6188245bed74d3a633d6e7b4b0d1a1e71db2bedd990d704e3fe1784f144b2661 Feb 19 19:33:15 crc kubenswrapper[4722]: I0219 19:33:15.989952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" event={"ID":"f41ca32e-24fc-427a-a2bc-76e4d5abba0f","Type":"ContainerStarted","Data":"7321c210129aad1c086d6637f960184aa78d4127678701faa27f6affe6443088"} Feb 19 19:33:15 crc kubenswrapper[4722]: I0219 19:33:15.991452 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" event={"ID":"02eda63c-5131-407e-bb2e-7ad0adf0e985","Type":"ContainerStarted","Data":"6188245bed74d3a633d6e7b4b0d1a1e71db2bedd990d704e3fe1784f144b2661"} Feb 19 19:33:20 crc kubenswrapper[4722]: I0219 19:33:20.040663 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" 
event={"ID":"f41ca32e-24fc-427a-a2bc-76e4d5abba0f","Type":"ContainerStarted","Data":"fc23e83eb336b28c2e17c393241d9a071d489ddd2e45d48654dee175777e1d11"} Feb 19 19:33:20 crc kubenswrapper[4722]: I0219 19:33:20.043063 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:20 crc kubenswrapper[4722]: I0219 19:33:20.077421 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" podStartSLOduration=2.076305992 podStartE2EDuration="6.077404282s" podCreationTimestamp="2026-02-19 19:33:14 +0000 UTC" firstStartedPulling="2026-02-19 19:33:15.197790687 +0000 UTC m=+894.810141011" lastFinishedPulling="2026-02-19 19:33:19.198888977 +0000 UTC m=+898.811239301" observedRunningTime="2026-02-19 19:33:20.075736679 +0000 UTC m=+899.688087013" watchObservedRunningTime="2026-02-19 19:33:20.077404282 +0000 UTC m=+899.689754606" Feb 19 19:33:22 crc kubenswrapper[4722]: I0219 19:33:22.056040 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" event={"ID":"02eda63c-5131-407e-bb2e-7ad0adf0e985","Type":"ContainerStarted","Data":"913eab5307a521c0eddf7939b7a2e5bf07dc152e28955b9917626731e0702e53"} Feb 19 19:33:22 crc kubenswrapper[4722]: I0219 19:33:22.056401 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:22 crc kubenswrapper[4722]: I0219 19:33:22.078747 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" podStartSLOduration=2.536726962 podStartE2EDuration="8.078730076s" podCreationTimestamp="2026-02-19 19:33:14 +0000 UTC" firstStartedPulling="2026-02-19 19:33:15.496491017 +0000 UTC m=+895.108841341" lastFinishedPulling="2026-02-19 
19:33:21.038494131 +0000 UTC m=+900.650844455" observedRunningTime="2026-02-19 19:33:22.075040001 +0000 UTC m=+901.687390325" watchObservedRunningTime="2026-02-19 19:33:22.078730076 +0000 UTC m=+901.691080400" Feb 19 19:33:35 crc kubenswrapper[4722]: I0219 19:33:35.084309 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-78b8d96b76-5d9t2" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.013775 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.015213 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.028333 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.031086 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.031489 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.031514 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6rz6\" (UniqueName: 
\"kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.131969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.132034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.132060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6rz6\" (UniqueName: \"kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.132485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.132569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.150686 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6rz6\" (UniqueName: \"kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6\") pod \"redhat-marketplace-8l56l\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.329927 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:49 crc kubenswrapper[4722]: I0219 19:33:49.803426 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:33:49 crc kubenswrapper[4722]: W0219 19:33:49.807092 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdf0909_68d5_47d0_a7db_fb4b0badbb9e.slice/crio-da7d88763ac29dfb7eb6a77b0d8c30b03e73fe083e5a4a978094f7842ce17c83 WatchSource:0}: Error finding container da7d88763ac29dfb7eb6a77b0d8c30b03e73fe083e5a4a978094f7842ce17c83: Status 404 returned error can't find the container with id da7d88763ac29dfb7eb6a77b0d8c30b03e73fe083e5a4a978094f7842ce17c83 Feb 19 19:33:50 crc kubenswrapper[4722]: I0219 19:33:50.539824 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerStarted","Data":"da7d88763ac29dfb7eb6a77b0d8c30b03e73fe083e5a4a978094f7842ce17c83"} Feb 19 19:33:51 crc kubenswrapper[4722]: I0219 19:33:51.547627 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerID="af1a7852e6e36db3bfdab60b2f38a0b981bc6b497a8bfeafd40f05127561893a" exitCode=0 Feb 19 19:33:51 crc kubenswrapper[4722]: I0219 19:33:51.547674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerDied","Data":"af1a7852e6e36db3bfdab60b2f38a0b981bc6b497a8bfeafd40f05127561893a"} Feb 19 19:33:52 crc kubenswrapper[4722]: I0219 19:33:52.557453 4722 generic.go:334] "Generic (PLEG): container finished" podID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerID="af96eb5f1e4c5f1724573bceb2f73b63f2d39809a28f76d211167409be3a723e" exitCode=0 Feb 19 19:33:52 crc kubenswrapper[4722]: I0219 19:33:52.557511 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerDied","Data":"af96eb5f1e4c5f1724573bceb2f73b63f2d39809a28f76d211167409be3a723e"} Feb 19 19:33:53 crc kubenswrapper[4722]: I0219 19:33:53.568075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerStarted","Data":"12ce3eb4afea4d6ace6faf3e818b5c4cd3e397c6e02f7f01b42f6d150d8e5597"} Feb 19 19:33:53 crc kubenswrapper[4722]: I0219 19:33:53.591235 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8l56l" podStartSLOduration=4.171619009 podStartE2EDuration="5.59120538s" podCreationTimestamp="2026-02-19 19:33:48 +0000 UTC" firstStartedPulling="2026-02-19 19:33:51.55022259 +0000 UTC m=+931.162572934" lastFinishedPulling="2026-02-19 19:33:52.969808971 +0000 UTC m=+932.582159305" observedRunningTime="2026-02-19 19:33:53.588899209 +0000 UTC m=+933.201249543" watchObservedRunningTime="2026-02-19 19:33:53.59120538 +0000 UTC m=+933.203555754" Feb 19 19:33:54 
crc kubenswrapper[4722]: I0219 19:33:54.704796 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-84788dc4db-d5shx" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.484048 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.485439 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.487087 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-krb58" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.488071 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.500690 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-92pkj"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.504013 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.507042 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.507069 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.509742 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.599330 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-nnmrq"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.601333 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.603575 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-pr7mk" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.604268 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.604357 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.604901 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.613859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt27g\" (UniqueName: \"kubernetes.io/projected/505e06e7-65a2-4444-8552-8b96253c87fc-kube-api-access-pt27g\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " 
pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.613949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-sockets\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.613979 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bg8l\" (UniqueName: \"kubernetes.io/projected/4f25f2fe-8438-431d-9e9d-9efba0109efd-kube-api-access-5bg8l\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614047 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-startup\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614063 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics-certs\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: 
I0219 19:33:55.614085 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-conf\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/505e06e7-65a2-4444-8552-8b96253c87fc-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614165 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-reloader\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.614306 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-h9kn7"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.621437 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.626108 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.634238 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-h9kn7"] Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/505e06e7-65a2-4444-8552-8b96253c87fc-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717218 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-reloader\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt27g\" (UniqueName: \"kubernetes.io/projected/505e06e7-65a2-4444-8552-8b96253c87fc-kube-api-access-pt27g\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717301 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc 
kubenswrapper[4722]: I0219 19:33:55.717339 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-sockets\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717365 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bg8l\" (UniqueName: \"kubernetes.io/projected/4f25f2fe-8438-431d-9e9d-9efba0109efd-kube-api-access-5bg8l\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717423 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d1319426-40ee-40fc-86bf-64cca26d6860-metallb-excludel2\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717459 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f5r5\" (UniqueName: \"kubernetes.io/projected/d1319426-40ee-40fc-86bf-64cca26d6860-kube-api-access-9f5r5\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717481 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-startup\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717528 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics-certs\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717554 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-conf\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.717587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-metrics-certs\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.718029 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-reloader\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.718688 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-sockets\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" 
Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.719113 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-startup\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.719368 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.721304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4f25f2fe-8438-431d-9e9d-9efba0109efd-frr-conf\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.724616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f25f2fe-8438-431d-9e9d-9efba0109efd-metrics-certs\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.729568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/505e06e7-65a2-4444-8552-8b96253c87fc-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.735825 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bg8l\" (UniqueName: 
\"kubernetes.io/projected/4f25f2fe-8438-431d-9e9d-9efba0109efd-kube-api-access-5bg8l\") pod \"frr-k8s-92pkj\" (UID: \"4f25f2fe-8438-431d-9e9d-9efba0109efd\") " pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.745848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt27g\" (UniqueName: \"kubernetes.io/projected/505e06e7-65a2-4444-8552-8b96253c87fc-kube-api-access-pt27g\") pod \"frr-k8s-webhook-server-78b44bf5bb-8nh6q\" (UID: \"505e06e7-65a2-4444-8552-8b96253c87fc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.801833 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.818469 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f5r5\" (UniqueName: \"kubernetes.io/projected/d1319426-40ee-40fc-86bf-64cca26d6860-kube-api-access-9f5r5\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.818557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr5jn\" (UniqueName: \"kubernetes.io/projected/1a80711d-831e-42ab-a5f8-6272eba9c635-kube-api-access-tr5jn\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.818660 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-metrics-certs\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc 
kubenswrapper[4722]: I0219 19:33:55.819219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-metrics-certs\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.819283 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-cert\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.819314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: E0219 19:33:55.819371 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 19:33:55 crc kubenswrapper[4722]: E0219 19:33:55.819419 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist podName:d1319426-40ee-40fc-86bf-64cca26d6860 nodeName:}" failed. No retries permitted until 2026-02-19 19:33:56.319402867 +0000 UTC m=+935.931753191 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist") pod "speaker-nnmrq" (UID: "d1319426-40ee-40fc-86bf-64cca26d6860") : secret "metallb-memberlist" not found Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.819483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d1319426-40ee-40fc-86bf-64cca26d6860-metallb-excludel2\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.820330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d1319426-40ee-40fc-86bf-64cca26d6860-metallb-excludel2\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.822880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-metrics-certs\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.824411 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-92pkj" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.838636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f5r5\" (UniqueName: \"kubernetes.io/projected/d1319426-40ee-40fc-86bf-64cca26d6860-kube-api-access-9f5r5\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.925366 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-metrics-certs\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.925680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-cert\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.925745 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr5jn\" (UniqueName: \"kubernetes.io/projected/1a80711d-831e-42ab-a5f8-6272eba9c635-kube-api-access-tr5jn\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.928423 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.933608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-metrics-certs\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.939914 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a80711d-831e-42ab-a5f8-6272eba9c635-cert\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.940642 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr5jn\" (UniqueName: \"kubernetes.io/projected/1a80711d-831e-42ab-a5f8-6272eba9c635-kube-api-access-tr5jn\") pod \"controller-69bbfbf88f-h9kn7\" (UID: \"1a80711d-831e-42ab-a5f8-6272eba9c635\") " pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:55 crc kubenswrapper[4722]: I0219 19:33:55.944436 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.031815 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q"] Feb 19 19:33:56 crc kubenswrapper[4722]: W0219 19:33:56.035767 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod505e06e7_65a2_4444_8552_8b96253c87fc.slice/crio-043f36879536848c7e2210b1f6cd4ba729daefca8062b124129ce43da1e00654 WatchSource:0}: Error finding container 043f36879536848c7e2210b1f6cd4ba729daefca8062b124129ce43da1e00654: Status 404 returned error can't find the container with id 043f36879536848c7e2210b1f6cd4ba729daefca8062b124129ce43da1e00654 Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.179442 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-h9kn7"] Feb 19 19:33:56 crc kubenswrapper[4722]: W0219 19:33:56.185684 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a80711d_831e_42ab_a5f8_6272eba9c635.slice/crio-7ac4501d64ea008f85a46f4a71e898e6c68d656d1469d41ff0b6a7dffdf52d4d WatchSource:0}: Error finding container 7ac4501d64ea008f85a46f4a71e898e6c68d656d1469d41ff0b6a7dffdf52d4d: Status 404 returned error can't find the container with id 7ac4501d64ea008f85a46f4a71e898e6c68d656d1469d41ff0b6a7dffdf52d4d Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.331103 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:56 crc kubenswrapper[4722]: E0219 19:33:56.331384 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret 
"metallb-memberlist" not found Feb 19 19:33:56 crc kubenswrapper[4722]: E0219 19:33:56.331447 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist podName:d1319426-40ee-40fc-86bf-64cca26d6860 nodeName:}" failed. No retries permitted until 2026-02-19 19:33:57.331430417 +0000 UTC m=+936.943780741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist") pod "speaker-nnmrq" (UID: "d1319426-40ee-40fc-86bf-64cca26d6860") : secret "metallb-memberlist" not found Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.587814 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"152ab4c38c225521aba2e1acafed36b4d320cc55e57360f92a20c8bf2c2e533c"} Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.589497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-h9kn7" event={"ID":"1a80711d-831e-42ab-a5f8-6272eba9c635","Type":"ContainerStarted","Data":"998f7c8f2634ab7e8cc0231960f414c17fd751c700d756b06944184f969ef182"} Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.589560 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-h9kn7" event={"ID":"1a80711d-831e-42ab-a5f8-6272eba9c635","Type":"ContainerStarted","Data":"c1d0e87d2acc19f08b4837c6c3fdf31552869b14af1c937474cb535c6a4eee34"} Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.589580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-h9kn7" event={"ID":"1a80711d-831e-42ab-a5f8-6272eba9c635","Type":"ContainerStarted","Data":"7ac4501d64ea008f85a46f4a71e898e6c68d656d1469d41ff0b6a7dffdf52d4d"} Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.589606 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.590935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" event={"ID":"505e06e7-65a2-4444-8552-8b96253c87fc","Type":"ContainerStarted","Data":"043f36879536848c7e2210b1f6cd4ba729daefca8062b124129ce43da1e00654"} Feb 19 19:33:56 crc kubenswrapper[4722]: I0219 19:33:56.605927 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-h9kn7" podStartSLOduration=1.605905243 podStartE2EDuration="1.605905243s" podCreationTimestamp="2026-02-19 19:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:33:56.605906333 +0000 UTC m=+936.218256677" watchObservedRunningTime="2026-02-19 19:33:56.605905243 +0000 UTC m=+936.218255577" Feb 19 19:33:57 crc kubenswrapper[4722]: I0219 19:33:57.345100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:57 crc kubenswrapper[4722]: I0219 19:33:57.351143 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d1319426-40ee-40fc-86bf-64cca26d6860-memberlist\") pod \"speaker-nnmrq\" (UID: \"d1319426-40ee-40fc-86bf-64cca26d6860\") " pod="metallb-system/speaker-nnmrq" Feb 19 19:33:57 crc kubenswrapper[4722]: I0219 19:33:57.420462 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-nnmrq" Feb 19 19:33:57 crc kubenswrapper[4722]: I0219 19:33:57.607588 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nnmrq" event={"ID":"d1319426-40ee-40fc-86bf-64cca26d6860","Type":"ContainerStarted","Data":"a489e743abc6ad17e094b0618b96f2aa426d7d5ea65b004f3707264d30afe920"} Feb 19 19:33:58 crc kubenswrapper[4722]: I0219 19:33:58.616198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nnmrq" event={"ID":"d1319426-40ee-40fc-86bf-64cca26d6860","Type":"ContainerStarted","Data":"6769b8f85e991ab6c87acb8ca3c3bfe3542ebcaf6815969f8eb2cd6aeffa1c1a"} Feb 19 19:33:58 crc kubenswrapper[4722]: I0219 19:33:58.616460 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nnmrq" event={"ID":"d1319426-40ee-40fc-86bf-64cca26d6860","Type":"ContainerStarted","Data":"9ec8844df828a64613f7c9044c992d17068db4f154151cb9e9fa90a7f1057d2c"} Feb 19 19:33:58 crc kubenswrapper[4722]: I0219 19:33:58.616491 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-nnmrq" Feb 19 19:33:58 crc kubenswrapper[4722]: I0219 19:33:58.634041 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-nnmrq" podStartSLOduration=3.634026972 podStartE2EDuration="3.634026972s" podCreationTimestamp="2026-02-19 19:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:33:58.630982938 +0000 UTC m=+938.243333262" watchObservedRunningTime="2026-02-19 19:33:58.634026972 +0000 UTC m=+938.246377296" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.330160 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.330232 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.397210 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.656234 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.657456 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.672806 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.715224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.781469 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.781621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.781643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jqsd9\" (UniqueName: \"kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.887323 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.887386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqsd9\" (UniqueName: \"kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.887431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.887880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.888612 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.923487 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqsd9\" (UniqueName: \"kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9\") pod \"certified-operators-v7894\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:33:59 crc kubenswrapper[4722]: I0219 19:33:59.978416 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:00 crc kubenswrapper[4722]: I0219 19:34:00.459434 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:34:00 crc kubenswrapper[4722]: I0219 19:34:00.633568 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerStarted","Data":"19f5d48196f47456f69c12a506337347dc985dee8f4eac6265950fed4107d051"} Feb 19 19:34:01 crc kubenswrapper[4722]: I0219 19:34:01.639377 4722 generic.go:334] "Generic (PLEG): container finished" podID="97a52bce-2539-405e-867d-922857a2ce75" containerID="cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230" exitCode=0 Feb 19 19:34:01 crc kubenswrapper[4722]: I0219 19:34:01.639421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerDied","Data":"cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230"} Feb 19 19:34:02 crc kubenswrapper[4722]: I0219 19:34:02.040269 4722 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:34:02 crc kubenswrapper[4722]: I0219 19:34:02.040840 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8l56l" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="registry-server" containerID="cri-o://12ce3eb4afea4d6ace6faf3e818b5c4cd3e397c6e02f7f01b42f6d150d8e5597" gracePeriod=2 Feb 19 19:34:02 crc kubenswrapper[4722]: I0219 19:34:02.648020 4722 generic.go:334] "Generic (PLEG): container finished" podID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerID="12ce3eb4afea4d6ace6faf3e818b5c4cd3e397c6e02f7f01b42f6d150d8e5597" exitCode=0 Feb 19 19:34:02 crc kubenswrapper[4722]: I0219 19:34:02.648100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerDied","Data":"12ce3eb4afea4d6ace6faf3e818b5c4cd3e397c6e02f7f01b42f6d150d8e5597"} Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.425485 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.466023 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities\") pod \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.466229 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6rz6\" (UniqueName: \"kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6\") pod \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.466292 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content\") pod \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\" (UID: \"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e\") " Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.467531 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities" (OuterVolumeSpecName: "utilities") pod "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" (UID: "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.483514 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6" (OuterVolumeSpecName: "kube-api-access-j6rz6") pod "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" (UID: "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e"). InnerVolumeSpecName "kube-api-access-j6rz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.507394 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" (UID: "4fdf0909-68d5-47d0-a7db-fb4b0badbb9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.567493 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.567558 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.567568 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6rz6\" (UniqueName: \"kubernetes.io/projected/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e-kube-api-access-j6rz6\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.667753 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8l56l" event={"ID":"4fdf0909-68d5-47d0-a7db-fb4b0badbb9e","Type":"ContainerDied","Data":"da7d88763ac29dfb7eb6a77b0d8c30b03e73fe083e5a4a978094f7842ce17c83"} Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.667810 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8l56l" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.668168 4722 scope.go:117] "RemoveContainer" containerID="12ce3eb4afea4d6ace6faf3e818b5c4cd3e397c6e02f7f01b42f6d150d8e5597" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.693532 4722 scope.go:117] "RemoveContainer" containerID="af96eb5f1e4c5f1724573bceb2f73b63f2d39809a28f76d211167409be3a723e" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.727673 4722 scope.go:117] "RemoveContainer" containerID="af1a7852e6e36db3bfdab60b2f38a0b981bc6b497a8bfeafd40f05127561893a" Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.735063 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:34:05 crc kubenswrapper[4722]: I0219 19:34:05.740273 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8l56l"] Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.691625 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" event={"ID":"505e06e7-65a2-4444-8552-8b96253c87fc","Type":"ContainerStarted","Data":"c96664872083f47d40ce3d34aaf5dc56bde7e0c53f5e2c85d94d1ab7dd6b8ec8"} Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.692274 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.696130 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f25f2fe-8438-431d-9e9d-9efba0109efd" containerID="7ea840970662d7f3f2ae6378d191226722efed6f477f17b743321f3bde30ca52" exitCode=0 Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.696412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" 
event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerDied","Data":"7ea840970662d7f3f2ae6378d191226722efed6f477f17b743321f3bde30ca52"} Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.703992 4722 generic.go:334] "Generic (PLEG): container finished" podID="97a52bce-2539-405e-867d-922857a2ce75" containerID="aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031" exitCode=0 Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.704071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerDied","Data":"aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031"} Feb 19 19:34:06 crc kubenswrapper[4722]: I0219 19:34:06.736827 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" podStartSLOduration=2.2306797879999998 podStartE2EDuration="11.736766727s" podCreationTimestamp="2026-02-19 19:33:55 +0000 UTC" firstStartedPulling="2026-02-19 19:33:56.038054012 +0000 UTC m=+935.650404326" lastFinishedPulling="2026-02-19 19:34:05.544140931 +0000 UTC m=+945.156491265" observedRunningTime="2026-02-19 19:34:06.727704094 +0000 UTC m=+946.340054448" watchObservedRunningTime="2026-02-19 19:34:06.736766727 +0000 UTC m=+946.349117091" Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.079577 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" path="/var/lib/kubelet/pods/4fdf0909-68d5-47d0-a7db-fb4b0badbb9e/volumes" Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.423966 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-nnmrq" Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.711034 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f25f2fe-8438-431d-9e9d-9efba0109efd" 
containerID="f664664bdbde67ddfaaa146d9801ae6dc0166fb9a4082a014c6c102abbbc4ed2" exitCode=0 Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.711124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerDied","Data":"f664664bdbde67ddfaaa146d9801ae6dc0166fb9a4082a014c6c102abbbc4ed2"} Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.714534 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerStarted","Data":"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06"} Feb 19 19:34:07 crc kubenswrapper[4722]: I0219 19:34:07.775585 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v7894" podStartSLOduration=3.292267615 podStartE2EDuration="8.775558668s" podCreationTimestamp="2026-02-19 19:33:59 +0000 UTC" firstStartedPulling="2026-02-19 19:34:01.641123948 +0000 UTC m=+941.253474272" lastFinishedPulling="2026-02-19 19:34:07.124414981 +0000 UTC m=+946.736765325" observedRunningTime="2026-02-19 19:34:07.770184731 +0000 UTC m=+947.382535065" watchObservedRunningTime="2026-02-19 19:34:07.775558668 +0000 UTC m=+947.387908992" Feb 19 19:34:08 crc kubenswrapper[4722]: I0219 19:34:08.727276 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f25f2fe-8438-431d-9e9d-9efba0109efd" containerID="4d890c4dac125bb380d322709200de0a7be37d720848b6c18dd4ea38f5947388" exitCode=0 Feb 19 19:34:08 crc kubenswrapper[4722]: I0219 19:34:08.727328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerDied","Data":"4d890c4dac125bb380d322709200de0a7be37d720848b6c18dd4ea38f5947388"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.742398 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"874b253e0bccb7f97ebd95fcb54c0e31eba020ca616e9fb59196bcf1ac283315"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.742818 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"0b306ce5212c17fdbe1828b59c663f13d6b0eb82442bedd010f3f1d0ae151396"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.742837 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"d4998c2193b90cda193d0b0cf7920519713faa546ce2f1032b76b9e2cbecef31"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.742882 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"bd365dd3a658c77ec8005c17e8f91ba642703406a3c3f1c689db6b563245bb9c"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.742897 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"7e40432bb8f30bfde889a12c17f0af54b3f76fbe90e89b03c3bdebb482bbfc64"} Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.979087 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:09 crc kubenswrapper[4722]: I0219 19:34:09.979168 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:10 crc kubenswrapper[4722]: I0219 19:34:10.039696 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v7894" 
Feb 19 19:34:10 crc kubenswrapper[4722]: I0219 19:34:10.753889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-92pkj" event={"ID":"4f25f2fe-8438-431d-9e9d-9efba0109efd","Type":"ContainerStarted","Data":"310864aff00bb3126a6253a731db09195b44e5e3c05266c4ffa908556dd4ec81"} Feb 19 19:34:10 crc kubenswrapper[4722]: I0219 19:34:10.781996 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-92pkj" podStartSLOduration=6.2603589809999995 podStartE2EDuration="15.781975554s" podCreationTimestamp="2026-02-19 19:33:55 +0000 UTC" firstStartedPulling="2026-02-19 19:33:55.98665845 +0000 UTC m=+935.599008774" lastFinishedPulling="2026-02-19 19:34:05.508275023 +0000 UTC m=+945.120625347" observedRunningTime="2026-02-19 19:34:10.773957154 +0000 UTC m=+950.386307498" watchObservedRunningTime="2026-02-19 19:34:10.781975554 +0000 UTC m=+950.394325898" Feb 19 19:34:10 crc kubenswrapper[4722]: I0219 19:34:10.825048 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-92pkj" Feb 19 19:34:10 crc kubenswrapper[4722]: I0219 19:34:10.862529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-92pkj" Feb 19 19:34:11 crc kubenswrapper[4722]: I0219 19:34:11.758991 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-92pkj" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.659585 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-knsfg"] Feb 19 19:34:13 crc kubenswrapper[4722]: E0219 19:34:13.660106 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="extract-content" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.660134 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" 
containerName="extract-content" Feb 19 19:34:13 crc kubenswrapper[4722]: E0219 19:34:13.660234 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="extract-utilities" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.660249 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="extract-utilities" Feb 19 19:34:13 crc kubenswrapper[4722]: E0219 19:34:13.660270 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="registry-server" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.660283 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="registry-server" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.660480 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fdf0909-68d5-47d0-a7db-fb4b0badbb9e" containerName="registry-server" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.661196 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.664509 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.664777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-jnrhw" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.664926 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.669384 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-knsfg"] Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.784252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmd4q\" (UniqueName: \"kubernetes.io/projected/efd426b6-a53d-4127-ae59-e2f9aec632cc-kube-api-access-nmd4q\") pod \"openstack-operator-index-knsfg\" (UID: \"efd426b6-a53d-4127-ae59-e2f9aec632cc\") " pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.885908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmd4q\" (UniqueName: \"kubernetes.io/projected/efd426b6-a53d-4127-ae59-e2f9aec632cc-kube-api-access-nmd4q\") pod \"openstack-operator-index-knsfg\" (UID: \"efd426b6-a53d-4127-ae59-e2f9aec632cc\") " pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:13 crc kubenswrapper[4722]: I0219 19:34:13.908985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmd4q\" (UniqueName: \"kubernetes.io/projected/efd426b6-a53d-4127-ae59-e2f9aec632cc-kube-api-access-nmd4q\") pod \"openstack-operator-index-knsfg\" (UID: 
\"efd426b6-a53d-4127-ae59-e2f9aec632cc\") " pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:14 crc kubenswrapper[4722]: I0219 19:34:14.021348 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:14 crc kubenswrapper[4722]: I0219 19:34:14.493542 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-knsfg"] Feb 19 19:34:14 crc kubenswrapper[4722]: I0219 19:34:14.784667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-knsfg" event={"ID":"efd426b6-a53d-4127-ae59-e2f9aec632cc","Type":"ContainerStarted","Data":"82ff5f971bb985c47c54ed6dd388bf0d32c67ddd65ee8d44e0802bf6a19497b9"} Feb 19 19:34:15 crc kubenswrapper[4722]: I0219 19:34:15.807482 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-8nh6q" Feb 19 19:34:15 crc kubenswrapper[4722]: I0219 19:34:15.951875 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-h9kn7" Feb 19 19:34:17 crc kubenswrapper[4722]: I0219 19:34:17.823297 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-knsfg" event={"ID":"efd426b6-a53d-4127-ae59-e2f9aec632cc","Type":"ContainerStarted","Data":"0fb9d395a665f45d2643f4879001487512fbda36d3d30d7d2a1031a259e9a39b"} Feb 19 19:34:17 crc kubenswrapper[4722]: I0219 19:34:17.854680 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-knsfg" podStartSLOduration=2.661166546 podStartE2EDuration="4.85465532s" podCreationTimestamp="2026-02-19 19:34:13 +0000 UTC" firstStartedPulling="2026-02-19 19:34:14.501136306 +0000 UTC m=+954.113486650" lastFinishedPulling="2026-02-19 19:34:16.6946251 +0000 UTC m=+956.306975424" 
observedRunningTime="2026-02-19 19:34:17.851061918 +0000 UTC m=+957.463412272" watchObservedRunningTime="2026-02-19 19:34:17.85465532 +0000 UTC m=+957.467005684" Feb 19 19:34:20 crc kubenswrapper[4722]: I0219 19:34:20.018579 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:23 crc kubenswrapper[4722]: I0219 19:34:23.848142 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:34:23 crc kubenswrapper[4722]: I0219 19:34:23.851093 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v7894" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="registry-server" containerID="cri-o://be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06" gracePeriod=2 Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.022249 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.022670 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.054669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.249722 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.443399 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content\") pod \"97a52bce-2539-405e-867d-922857a2ce75\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.443546 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities\") pod \"97a52bce-2539-405e-867d-922857a2ce75\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.443607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqsd9\" (UniqueName: \"kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9\") pod \"97a52bce-2539-405e-867d-922857a2ce75\" (UID: \"97a52bce-2539-405e-867d-922857a2ce75\") " Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.444339 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities" (OuterVolumeSpecName: "utilities") pod "97a52bce-2539-405e-867d-922857a2ce75" (UID: "97a52bce-2539-405e-867d-922857a2ce75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.448962 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9" (OuterVolumeSpecName: "kube-api-access-jqsd9") pod "97a52bce-2539-405e-867d-922857a2ce75" (UID: "97a52bce-2539-405e-867d-922857a2ce75"). InnerVolumeSpecName "kube-api-access-jqsd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.491533 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97a52bce-2539-405e-867d-922857a2ce75" (UID: "97a52bce-2539-405e-867d-922857a2ce75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.545334 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.545386 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqsd9\" (UniqueName: \"kubernetes.io/projected/97a52bce-2539-405e-867d-922857a2ce75-kube-api-access-jqsd9\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.545401 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97a52bce-2539-405e-867d-922857a2ce75-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.871661 4722 generic.go:334] "Generic (PLEG): container finished" podID="97a52bce-2539-405e-867d-922857a2ce75" containerID="be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06" exitCode=0 Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.872006 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v7894" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.871865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerDied","Data":"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06"} Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.872554 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v7894" event={"ID":"97a52bce-2539-405e-867d-922857a2ce75","Type":"ContainerDied","Data":"19f5d48196f47456f69c12a506337347dc985dee8f4eac6265950fed4107d051"} Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.872574 4722 scope.go:117] "RemoveContainer" containerID="be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.891651 4722 scope.go:117] "RemoveContainer" containerID="aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.910080 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-knsfg" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.920165 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.925389 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v7894"] Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.931477 4722 scope.go:117] "RemoveContainer" containerID="cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.960325 4722 scope.go:117] "RemoveContainer" containerID="be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06" Feb 19 19:34:24 crc 
kubenswrapper[4722]: E0219 19:34:24.960743 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06\": container with ID starting with be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06 not found: ID does not exist" containerID="be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.960787 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06"} err="failed to get container status \"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06\": rpc error: code = NotFound desc = could not find container \"be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06\": container with ID starting with be9ead216df05794aa38e6b99f7a78dfd9d221bb064c1d11ed4dbcaa44621c06 not found: ID does not exist" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.960818 4722 scope.go:117] "RemoveContainer" containerID="aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031" Feb 19 19:34:24 crc kubenswrapper[4722]: E0219 19:34:24.961182 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031\": container with ID starting with aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031 not found: ID does not exist" containerID="aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.961221 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031"} err="failed to get container status 
\"aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031\": rpc error: code = NotFound desc = could not find container \"aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031\": container with ID starting with aca5b422b256c2c49a98f09bd56861196542e5ff2dafedcfdf45a52f46865031 not found: ID does not exist" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.961246 4722 scope.go:117] "RemoveContainer" containerID="cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230" Feb 19 19:34:24 crc kubenswrapper[4722]: E0219 19:34:24.961594 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230\": container with ID starting with cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230 not found: ID does not exist" containerID="cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230" Feb 19 19:34:24 crc kubenswrapper[4722]: I0219 19:34:24.961624 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230"} err="failed to get container status \"cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230\": rpc error: code = NotFound desc = could not find container \"cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230\": container with ID starting with cf68e14d81087e310c74bc316885a57bf6a81d8b6620fdf5cadea3715c41d230 not found: ID does not exist" Feb 19 19:34:25 crc kubenswrapper[4722]: I0219 19:34:25.079440 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a52bce-2539-405e-867d-922857a2ce75" path="/var/lib/kubelet/pods/97a52bce-2539-405e-867d-922857a2ce75/volumes" Feb 19 19:34:25 crc kubenswrapper[4722]: I0219 19:34:25.828755 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-92pkj" Feb 19 
19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.487126 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2"] Feb 19 19:34:26 crc kubenswrapper[4722]: E0219 19:34:26.487840 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="extract-content" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.487862 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="extract-content" Feb 19 19:34:26 crc kubenswrapper[4722]: E0219 19:34:26.487905 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="registry-server" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.487920 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="registry-server" Feb 19 19:34:26 crc kubenswrapper[4722]: E0219 19:34:26.487937 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="extract-utilities" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.487975 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="extract-utilities" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.488229 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="97a52bce-2539-405e-867d-922857a2ce75" containerName="registry-server" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.489896 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.491702 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gb525" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.501578 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2"] Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.571897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.571998 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.572077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v47b\" (UniqueName: \"kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 
19:34:26.673310 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.673390 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.673468 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v47b\" (UniqueName: \"kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.673775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.674026 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.696736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v47b\" (UniqueName: \"kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b\") pod \"533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:26 crc kubenswrapper[4722]: I0219 19:34:26.805362 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:27 crc kubenswrapper[4722]: I0219 19:34:27.254284 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2"] Feb 19 19:34:27 crc kubenswrapper[4722]: W0219 19:34:27.256171 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4260359d_1333_4ec5_9a57_16e2782fcf0f.slice/crio-83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a WatchSource:0}: Error finding container 83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a: Status 404 returned error can't find the container with id 83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a Feb 19 19:34:27 crc kubenswrapper[4722]: I0219 19:34:27.893186 4722 generic.go:334] "Generic (PLEG): container finished" podID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerID="ab7554108563399dcf6667cfbe885a7d27fad38216d3ea33bca15a54ecc72218" exitCode=0 Feb 19 
19:34:27 crc kubenswrapper[4722]: I0219 19:34:27.893291 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" event={"ID":"4260359d-1333-4ec5-9a57-16e2782fcf0f","Type":"ContainerDied","Data":"ab7554108563399dcf6667cfbe885a7d27fad38216d3ea33bca15a54ecc72218"} Feb 19 19:34:27 crc kubenswrapper[4722]: I0219 19:34:27.893472 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" event={"ID":"4260359d-1333-4ec5-9a57-16e2782fcf0f","Type":"ContainerStarted","Data":"83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a"} Feb 19 19:34:30 crc kubenswrapper[4722]: I0219 19:34:30.916514 4722 generic.go:334] "Generic (PLEG): container finished" podID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerID="0fff30a4a814644ff10c1e4f1b8d8a6bb84fe24437b4e7448f13179de5df38ed" exitCode=0 Feb 19 19:34:30 crc kubenswrapper[4722]: I0219 19:34:30.916579 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" event={"ID":"4260359d-1333-4ec5-9a57-16e2782fcf0f","Type":"ContainerDied","Data":"0fff30a4a814644ff10c1e4f1b8d8a6bb84fe24437b4e7448f13179de5df38ed"} Feb 19 19:34:31 crc kubenswrapper[4722]: I0219 19:34:31.925754 4722 generic.go:334] "Generic (PLEG): container finished" podID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerID="68a41a01ba9a1fef435bf6f5b6b832084a0ab857191fb46380ee752eaeebab7d" exitCode=0 Feb 19 19:34:31 crc kubenswrapper[4722]: I0219 19:34:31.925857 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" event={"ID":"4260359d-1333-4ec5-9a57-16e2782fcf0f","Type":"ContainerDied","Data":"68a41a01ba9a1fef435bf6f5b6b832084a0ab857191fb46380ee752eaeebab7d"} Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.219130 
4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.277451 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v47b\" (UniqueName: \"kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b\") pod \"4260359d-1333-4ec5-9a57-16e2782fcf0f\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.277905 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util\") pod \"4260359d-1333-4ec5-9a57-16e2782fcf0f\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.277975 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle\") pod \"4260359d-1333-4ec5-9a57-16e2782fcf0f\" (UID: \"4260359d-1333-4ec5-9a57-16e2782fcf0f\") " Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.279821 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle" (OuterVolumeSpecName: "bundle") pod "4260359d-1333-4ec5-9a57-16e2782fcf0f" (UID: "4260359d-1333-4ec5-9a57-16e2782fcf0f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.285515 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b" (OuterVolumeSpecName: "kube-api-access-4v47b") pod "4260359d-1333-4ec5-9a57-16e2782fcf0f" (UID: "4260359d-1333-4ec5-9a57-16e2782fcf0f"). 
InnerVolumeSpecName "kube-api-access-4v47b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.291051 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util" (OuterVolumeSpecName: "util") pod "4260359d-1333-4ec5-9a57-16e2782fcf0f" (UID: "4260359d-1333-4ec5-9a57-16e2782fcf0f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.379357 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-util\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.379396 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4260359d-1333-4ec5-9a57-16e2782fcf0f-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.379405 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v47b\" (UniqueName: \"kubernetes.io/projected/4260359d-1333-4ec5-9a57-16e2782fcf0f-kube-api-access-4v47b\") on node \"crc\" DevicePath \"\"" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.939382 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" event={"ID":"4260359d-1333-4ec5-9a57-16e2782fcf0f","Type":"ContainerDied","Data":"83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a"} Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.939418 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83c3b842b70b10c1e1c61ed83c347be856a84fc970c98fe9cfe8574a92592e4a" Feb 19 19:34:33 crc kubenswrapper[4722]: I0219 19:34:33.939437 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.238787 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q"] Feb 19 19:34:38 crc kubenswrapper[4722]: E0219 19:34:38.239380 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="extract" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.239394 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="extract" Feb 19 19:34:38 crc kubenswrapper[4722]: E0219 19:34:38.239408 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="util" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.239415 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="util" Feb 19 19:34:38 crc kubenswrapper[4722]: E0219 19:34:38.239435 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="pull" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.239443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="pull" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.239567 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4260359d-1333-4ec5-9a57-16e2782fcf0f" containerName="extract" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.240126 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.242541 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-r8q9n" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.270940 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q"] Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.388186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdmxc\" (UniqueName: \"kubernetes.io/projected/fb86a4c4-379d-4dcd-86c5-5ee95092e6c0-kube-api-access-gdmxc\") pod \"openstack-operator-controller-init-6ddf4746f6-l927q\" (UID: \"fb86a4c4-379d-4dcd-86c5-5ee95092e6c0\") " pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.490111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdmxc\" (UniqueName: \"kubernetes.io/projected/fb86a4c4-379d-4dcd-86c5-5ee95092e6c0-kube-api-access-gdmxc\") pod \"openstack-operator-controller-init-6ddf4746f6-l927q\" (UID: \"fb86a4c4-379d-4dcd-86c5-5ee95092e6c0\") " pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.528233 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdmxc\" (UniqueName: \"kubernetes.io/projected/fb86a4c4-379d-4dcd-86c5-5ee95092e6c0-kube-api-access-gdmxc\") pod \"openstack-operator-controller-init-6ddf4746f6-l927q\" (UID: \"fb86a4c4-379d-4dcd-86c5-5ee95092e6c0\") " pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.557066 4722 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.796360 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q"] Feb 19 19:34:38 crc kubenswrapper[4722]: I0219 19:34:38.801576 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:34:39 crc kubenswrapper[4722]: I0219 19:34:39.080366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" event={"ID":"fb86a4c4-379d-4dcd-86c5-5ee95092e6c0","Type":"ContainerStarted","Data":"286a488e18fa657ccdd1b9a0961c73953796c8219c1d84e9f385a0d4d335dfc9"} Feb 19 19:34:41 crc kubenswrapper[4722]: I0219 19:34:41.799532 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:34:41 crc kubenswrapper[4722]: I0219 19:34:41.799900 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:34:44 crc kubenswrapper[4722]: I0219 19:34:44.115876 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" event={"ID":"fb86a4c4-379d-4dcd-86c5-5ee95092e6c0","Type":"ContainerStarted","Data":"040d704b01b3a5be919620b4ae2c981f0305775b07ff217791abe991d1521961"} Feb 19 19:34:44 crc kubenswrapper[4722]: I0219 19:34:44.116269 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:48 crc kubenswrapper[4722]: I0219 19:34:48.561037 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" Feb 19 19:34:48 crc kubenswrapper[4722]: I0219 19:34:48.610663 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6ddf4746f6-l927q" podStartSLOduration=5.8653512249999995 podStartE2EDuration="10.610631043s" podCreationTimestamp="2026-02-19 19:34:38 +0000 UTC" firstStartedPulling="2026-02-19 19:34:38.801216158 +0000 UTC m=+978.413566482" lastFinishedPulling="2026-02-19 19:34:43.546495976 +0000 UTC m=+983.158846300" observedRunningTime="2026-02-19 19:34:44.144883358 +0000 UTC m=+983.757233682" watchObservedRunningTime="2026-02-19 19:34:48.610631043 +0000 UTC m=+988.222981427" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.006621 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.008450 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.011046 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-km9dx" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.011178 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.012042 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.013664 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qtmnj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.033693 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.042006 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.065478 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.066510 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.075071 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.076422 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.085316 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-m6mbq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.085484 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.086269 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.086599 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zkwq2" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.096312 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-d9thq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.098181 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.115327 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.117386 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.119732 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.122523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-zhb7q" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.133137 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.163980 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.182228 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.183099 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.188484 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.189138 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-f2sz9" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.189273 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.190053 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.199518 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-wf2jp" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.210017 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87z7d\" (UniqueName: \"kubernetes.io/projected/0af2e6ef-277d-4022-b42b-5639b589fef9-kube-api-access-87z7d\") pod \"barbican-operator-controller-manager-868647ff47-k5c54\" (UID: \"0af2e6ef-277d-4022-b42b-5639b589fef9\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.210099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sskjk\" (UniqueName: \"kubernetes.io/projected/b64009a1-83ef-4d66-bc6b-80ccfc6f7727-kube-api-access-sskjk\") pod \"cinder-operator-controller-manager-5d946d989d-x7bwr\" (UID: \"b64009a1-83ef-4d66-bc6b-80ccfc6f7727\") " 
pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.210128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xxth\" (UniqueName: \"kubernetes.io/projected/edbe95e5-3a5d-4dec-9a94-509234857155-kube-api-access-2xxth\") pod \"designate-operator-controller-manager-6d8bf5c495-mc64t\" (UID: \"edbe95e5-3a5d-4dec-9a94-509234857155\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.210614 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxm5\" (UniqueName: \"kubernetes.io/projected/baba09d1-2238-4ca1-98ee-f44938b68cd3-kube-api-access-4dxm5\") pod \"glance-operator-controller-manager-77987464f4-hxv5g\" (UID: \"baba09d1-2238-4ca1-98ee-f44938b68cd3\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.210680 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxr5d\" (UniqueName: \"kubernetes.io/projected/019f7edd-1d9b-4069-a2a1-36bbe6b0a567-kube-api-access-lxr5d\") pod \"heat-operator-controller-manager-69f49c598c-qrsw8\" (UID: \"019f7edd-1d9b-4069-a2a1-36bbe6b0a567\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.238307 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.239490 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.242494 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.243641 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.248652 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.251662 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zjmkn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.255077 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.262687 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-b2d6t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.271441 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.279897 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.282294 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.285831 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.291186 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.291907 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.298132 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8bmhz" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.298844 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-tbls9" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.300510 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.312930 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.317370 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-cj8np" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.329532 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwhl9\" (UniqueName: \"kubernetes.io/projected/c36983b4-b7f9-4834-85e9-a5c3cb83eb2d-kube-api-access-xwhl9\") pod \"ironic-operator-controller-manager-554564d7fc-rnh9h\" (UID: \"c36983b4-b7f9-4834-85e9-a5c3cb83eb2d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.329652 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sskjk\" (UniqueName: \"kubernetes.io/projected/b64009a1-83ef-4d66-bc6b-80ccfc6f7727-kube-api-access-sskjk\") pod \"cinder-operator-controller-manager-5d946d989d-x7bwr\" (UID: \"b64009a1-83ef-4d66-bc6b-80ccfc6f7727\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.329697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xxth\" (UniqueName: \"kubernetes.io/projected/edbe95e5-3a5d-4dec-9a94-509234857155-kube-api-access-2xxth\") pod \"designate-operator-controller-manager-6d8bf5c495-mc64t\" (UID: \"edbe95e5-3a5d-4dec-9a94-509234857155\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.331790 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5f69\" (UniqueName: \"kubernetes.io/projected/2c02c7e1-6f72-44be-a4fb-10ca1df420aa-kube-api-access-x5f69\") pod 
\"horizon-operator-controller-manager-5b9b8895d5-hncxm\" (UID: \"2c02c7e1-6f72-44be-a4fb-10ca1df420aa\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.331851 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxm5\" (UniqueName: \"kubernetes.io/projected/baba09d1-2238-4ca1-98ee-f44938b68cd3-kube-api-access-4dxm5\") pod \"glance-operator-controller-manager-77987464f4-hxv5g\" (UID: \"baba09d1-2238-4ca1-98ee-f44938b68cd3\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.331930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8xcp\" (UniqueName: \"kubernetes.io/projected/421f6539-4fcb-4949-ba29-34997fc98490-kube-api-access-w8xcp\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.331996 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxr5d\" (UniqueName: \"kubernetes.io/projected/019f7edd-1d9b-4069-a2a1-36bbe6b0a567-kube-api-access-lxr5d\") pod \"heat-operator-controller-manager-69f49c598c-qrsw8\" (UID: \"019f7edd-1d9b-4069-a2a1-36bbe6b0a567\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.332082 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87z7d\" (UniqueName: \"kubernetes.io/projected/0af2e6ef-277d-4022-b42b-5639b589fef9-kube-api-access-87z7d\") pod \"barbican-operator-controller-manager-868647ff47-k5c54\" (UID: \"0af2e6ef-277d-4022-b42b-5639b589fef9\") " 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.332220 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.348122 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.370403 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxr5d\" (UniqueName: \"kubernetes.io/projected/019f7edd-1d9b-4069-a2a1-36bbe6b0a567-kube-api-access-lxr5d\") pod \"heat-operator-controller-manager-69f49c598c-qrsw8\" (UID: \"019f7edd-1d9b-4069-a2a1-36bbe6b0a567\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.372776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxm5\" (UniqueName: \"kubernetes.io/projected/baba09d1-2238-4ca1-98ee-f44938b68cd3-kube-api-access-4dxm5\") pod \"glance-operator-controller-manager-77987464f4-hxv5g\" (UID: \"baba09d1-2238-4ca1-98ee-f44938b68cd3\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.373532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87z7d\" (UniqueName: \"kubernetes.io/projected/0af2e6ef-277d-4022-b42b-5639b589fef9-kube-api-access-87z7d\") pod \"barbican-operator-controller-manager-868647ff47-k5c54\" (UID: \"0af2e6ef-277d-4022-b42b-5639b589fef9\") " 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.375888 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sskjk\" (UniqueName: \"kubernetes.io/projected/b64009a1-83ef-4d66-bc6b-80ccfc6f7727-kube-api-access-sskjk\") pod \"cinder-operator-controller-manager-5d946d989d-x7bwr\" (UID: \"b64009a1-83ef-4d66-bc6b-80ccfc6f7727\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.381709 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.387850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xxth\" (UniqueName: \"kubernetes.io/projected/edbe95e5-3a5d-4dec-9a94-509234857155-kube-api-access-2xxth\") pod \"designate-operator-controller-manager-6d8bf5c495-mc64t\" (UID: \"edbe95e5-3a5d-4dec-9a94-509234857155\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.392610 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.393874 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.395624 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.401771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.404371 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-kdjrp" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.410021 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.412426 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434780 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5f69\" (UniqueName: \"kubernetes.io/projected/2c02c7e1-6f72-44be-a4fb-10ca1df420aa-kube-api-access-x5f69\") pod \"horizon-operator-controller-manager-5b9b8895d5-hncxm\" (UID: \"2c02c7e1-6f72-44be-a4fb-10ca1df420aa\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434829 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67z9c\" (UniqueName: \"kubernetes.io/projected/57783601-5230-49ef-8ac2-0ddf78bd4b3a-kube-api-access-67z9c\") pod \"neutron-operator-controller-manager-64ddbf8bb-6t7g6\" (UID: \"57783601-5230-49ef-8ac2-0ddf78bd4b3a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434868 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-w8xcp\" (UniqueName: \"kubernetes.io/projected/421f6539-4fcb-4949-ba29-34997fc98490-kube-api-access-w8xcp\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434892 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlwlx\" (UniqueName: \"kubernetes.io/projected/64ff9a64-f79f-4a45-943d-36152964cfcd-kube-api-access-xlwlx\") pod \"nova-operator-controller-manager-567668f5cf-wqp5t\" (UID: \"64ff9a64-f79f-4a45-943d-36152964cfcd\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434919 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvwq5\" (UniqueName: \"kubernetes.io/projected/b37b04c7-5374-49d3-97c0-5b5b27c4a220-kube-api-access-zvwq5\") pod \"mariadb-operator-controller-manager-6994f66f48-8cljg\" (UID: \"b37b04c7-5374-49d3-97c0-5b5b27c4a220\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd8gb\" (UniqueName: \"kubernetes.io/projected/766eebc1-05fc-4ca0-8c75-276632a6597e-kube-api-access-wd8gb\") pod \"manila-operator-controller-manager-54f6768c69-7qkx4\" (UID: \"766eebc1-05fc-4ca0-8c75-276632a6597e\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod 
\"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.434994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwhl9\" (UniqueName: \"kubernetes.io/projected/c36983b4-b7f9-4834-85e9-a5c3cb83eb2d-kube-api-access-xwhl9\") pod \"ironic-operator-controller-manager-554564d7fc-rnh9h\" (UID: \"c36983b4-b7f9-4834-85e9-a5c3cb83eb2d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.435012 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsjl6\" (UniqueName: \"kubernetes.io/projected/db329f91-74f2-4baa-ab5a-85ad999fc8ef-kube-api-access-zsjl6\") pod \"keystone-operator-controller-manager-b4d948c87-x6wk7\" (UID: \"db329f91-74f2-4baa-ab5a-85ad999fc8ef\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.435218 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.435695 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.435759 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert podName:421f6539-4fcb-4949-ba29-34997fc98490 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:08.935737771 +0000 UTC m=+1008.548088135 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert") pod "infra-operator-controller-manager-79d975b745-q5kgj" (UID: "421f6539-4fcb-4949-ba29-34997fc98490") : secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.442946 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.444713 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.448215 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-fcjp9" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.448397 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.457470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5f69\" (UniqueName: \"kubernetes.io/projected/2c02c7e1-6f72-44be-a4fb-10ca1df420aa-kube-api-access-x5f69\") pod \"horizon-operator-controller-manager-5b9b8895d5-hncxm\" (UID: \"2c02c7e1-6f72-44be-a4fb-10ca1df420aa\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.459714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwhl9\" (UniqueName: \"kubernetes.io/projected/c36983b4-b7f9-4834-85e9-a5c3cb83eb2d-kube-api-access-xwhl9\") pod \"ironic-operator-controller-manager-554564d7fc-rnh9h\" (UID: \"c36983b4-b7f9-4834-85e9-a5c3cb83eb2d\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 
19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.466413 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.467326 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.468568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8xcp\" (UniqueName: \"kubernetes.io/projected/421f6539-4fcb-4949-ba29-34997fc98490-kube-api-access-w8xcp\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.469491 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.472419 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tmczn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.477440 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.481236 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.483416 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-tgxpl" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.485301 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-wktqn"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.494165 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.495984 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.498096 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fbsdz" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.509031 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-wktqn"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.518550 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.536329 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsjl6\" (UniqueName: \"kubernetes.io/projected/db329f91-74f2-4baa-ab5a-85ad999fc8ef-kube-api-access-zsjl6\") pod \"keystone-operator-controller-manager-b4d948c87-x6wk7\" (UID: \"db329f91-74f2-4baa-ab5a-85ad999fc8ef\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.536667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf42n\" (UniqueName: \"kubernetes.io/projected/a6fb3554-24ea-4330-b2cb-1c91f105345d-kube-api-access-jf42n\") pod \"octavia-operator-controller-manager-69f8888797-zft4s\" (UID: \"a6fb3554-24ea-4330-b2cb-1c91f105345d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.536691 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67z9c\" (UniqueName: \"kubernetes.io/projected/57783601-5230-49ef-8ac2-0ddf78bd4b3a-kube-api-access-67z9c\") pod \"neutron-operator-controller-manager-64ddbf8bb-6t7g6\" (UID: \"57783601-5230-49ef-8ac2-0ddf78bd4b3a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.542292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlwlx\" (UniqueName: \"kubernetes.io/projected/64ff9a64-f79f-4a45-943d-36152964cfcd-kube-api-access-xlwlx\") pod \"nova-operator-controller-manager-567668f5cf-wqp5t\" (UID: \"64ff9a64-f79f-4a45-943d-36152964cfcd\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:08 crc kubenswrapper[4722]: 
I0219 19:35:08.542357 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvwq5\" (UniqueName: \"kubernetes.io/projected/b37b04c7-5374-49d3-97c0-5b5b27c4a220-kube-api-access-zvwq5\") pod \"mariadb-operator-controller-manager-6994f66f48-8cljg\" (UID: \"b37b04c7-5374-49d3-97c0-5b5b27c4a220\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.542385 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd8gb\" (UniqueName: \"kubernetes.io/projected/766eebc1-05fc-4ca0-8c75-276632a6597e-kube-api-access-wd8gb\") pod \"manila-operator-controller-manager-54f6768c69-7qkx4\" (UID: \"766eebc1-05fc-4ca0-8c75-276632a6597e\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.551126 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.557011 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67z9c\" (UniqueName: \"kubernetes.io/projected/57783601-5230-49ef-8ac2-0ddf78bd4b3a-kube-api-access-67z9c\") pod \"neutron-operator-controller-manager-64ddbf8bb-6t7g6\" (UID: \"57783601-5230-49ef-8ac2-0ddf78bd4b3a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.559700 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.560537 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.562631 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hvpw7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.568267 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsjl6\" (UniqueName: \"kubernetes.io/projected/db329f91-74f2-4baa-ab5a-85ad999fc8ef-kube-api-access-zsjl6\") pod \"keystone-operator-controller-manager-b4d948c87-x6wk7\" (UID: \"db329f91-74f2-4baa-ab5a-85ad999fc8ef\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.569294 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvwq5\" (UniqueName: \"kubernetes.io/projected/b37b04c7-5374-49d3-97c0-5b5b27c4a220-kube-api-access-zvwq5\") pod \"mariadb-operator-controller-manager-6994f66f48-8cljg\" (UID: \"b37b04c7-5374-49d3-97c0-5b5b27c4a220\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.570327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd8gb\" (UniqueName: \"kubernetes.io/projected/766eebc1-05fc-4ca0-8c75-276632a6597e-kube-api-access-wd8gb\") pod \"manila-operator-controller-manager-54f6768c69-7qkx4\" (UID: \"766eebc1-05fc-4ca0-8c75-276632a6597e\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.576185 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlwlx\" (UniqueName: \"kubernetes.io/projected/64ff9a64-f79f-4a45-943d-36152964cfcd-kube-api-access-xlwlx\") pod \"nova-operator-controller-manager-567668f5cf-wqp5t\" (UID: 
\"64ff9a64-f79f-4a45-943d-36152964cfcd\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.577759 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.593126 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.619181 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.629562 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.640197 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.643337 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-dbdmf"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwg4n\" (UniqueName: \"kubernetes.io/projected/820eede6-6396-4466-bf00-5d3b39d982d6-kube-api-access-cwg4n\") pod \"placement-operator-controller-manager-8497b45c89-mgzgq\" (UID: \"820eede6-6396-4466-bf00-5d3b39d982d6\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf42n\" (UniqueName: \"kubernetes.io/projected/a6fb3554-24ea-4330-b2cb-1c91f105345d-kube-api-access-jf42n\") pod \"octavia-operator-controller-manager-69f8888797-zft4s\" (UID: \"a6fb3554-24ea-4330-b2cb-1c91f105345d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644251 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmjb\" (UniqueName: \"kubernetes.io/projected/29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8-kube-api-access-7lmjb\") pod 
\"swift-operator-controller-manager-68f46476f-wktqn\" (UID: \"29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644311 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s64jz\" (UniqueName: \"kubernetes.io/projected/8870a7b1-f894-4429-9f52-d9063fe9c780-kube-api-access-s64jz\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644315 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.644368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6njp\" (UniqueName: \"kubernetes.io/projected/738a1346-88e9-4c4e-b7ce-1878736e2493-kube-api-access-h6njp\") pod \"ovn-operator-controller-manager-d44cf6b75-6dlqc\" (UID: \"738a1346-88e9-4c4e-b7ce-1878736e2493\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.646524 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vl4wv" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.653267 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-dbdmf"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.659940 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.677840 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.679532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf42n\" (UniqueName: \"kubernetes.io/projected/a6fb3554-24ea-4330-b2cb-1c91f105345d-kube-api-access-jf42n\") pod \"octavia-operator-controller-manager-69f8888797-zft4s\" (UID: \"a6fb3554-24ea-4330-b2cb-1c91f105345d\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.679618 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.682040 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gbphg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.684399 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6njp\" (UniqueName: \"kubernetes.io/projected/738a1346-88e9-4c4e-b7ce-1878736e2493-kube-api-access-h6njp\") pod \"ovn-operator-controller-manager-d44cf6b75-6dlqc\" (UID: \"738a1346-88e9-4c4e-b7ce-1878736e2493\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwg4n\" 
(UniqueName: \"kubernetes.io/projected/820eede6-6396-4466-bf00-5d3b39d982d6-kube-api-access-cwg4n\") pod \"placement-operator-controller-manager-8497b45c89-mgzgq\" (UID: \"820eede6-6396-4466-bf00-5d3b39d982d6\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748917 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6skxh\" (UniqueName: \"kubernetes.io/projected/792a7a0a-a11e-42ce-a99b-e24127e7bbe8-kube-api-access-6skxh\") pod \"telemetry-operator-controller-manager-5484b6858b-7g48c\" (UID: \"792a7a0a-a11e-42ce-a99b-e24127e7bbe8\") " pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748939 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqf2\" (UniqueName: \"kubernetes.io/projected/2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac-kube-api-access-6nqf2\") pod \"test-operator-controller-manager-7866795846-dbdmf\" (UID: \"2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac\") " pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.748994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmjb\" (UniqueName: \"kubernetes.io/projected/29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8-kube-api-access-7lmjb\") pod 
\"swift-operator-controller-manager-68f46476f-wktqn\" (UID: \"29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.749010 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s64jz\" (UniqueName: \"kubernetes.io/projected/8870a7b1-f894-4429-9f52-d9063fe9c780-kube-api-access-s64jz\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.751904 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.751988 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:09.251948029 +0000 UTC m=+1008.864298353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.753114 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.761450 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.762316 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.768703 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.768939 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-4zvsr" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.769666 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.792625 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6njp\" (UniqueName: \"kubernetes.io/projected/738a1346-88e9-4c4e-b7ce-1878736e2493-kube-api-access-h6njp\") pod \"ovn-operator-controller-manager-d44cf6b75-6dlqc\" (UID: \"738a1346-88e9-4c4e-b7ce-1878736e2493\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.793496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s64jz\" (UniqueName: \"kubernetes.io/projected/8870a7b1-f894-4429-9f52-d9063fe9c780-kube-api-access-s64jz\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 
19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.797269 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwg4n\" (UniqueName: \"kubernetes.io/projected/820eede6-6396-4466-bf00-5d3b39d982d6-kube-api-access-cwg4n\") pod \"placement-operator-controller-manager-8497b45c89-mgzgq\" (UID: \"820eede6-6396-4466-bf00-5d3b39d982d6\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.797498 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.797738 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.799296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmjb\" (UniqueName: \"kubernetes.io/projected/29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8-kube-api-access-7lmjb\") pod \"swift-operator-controller-manager-68f46476f-wktqn\" (UID: \"29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.815167 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850555 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrft7\" (UniqueName: \"kubernetes.io/projected/f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0-kube-api-access-nrft7\") pod \"watcher-operator-controller-manager-5db88f68c-zdfxj\" (UID: \"f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6skxh\" (UniqueName: \"kubernetes.io/projected/792a7a0a-a11e-42ce-a99b-e24127e7bbe8-kube-api-access-6skxh\") pod \"telemetry-operator-controller-manager-5484b6858b-7g48c\" (UID: \"792a7a0a-a11e-42ce-a99b-e24127e7bbe8\") " pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850608 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhqdp\" (UniqueName: \"kubernetes.io/projected/12f061e0-51af-4ab9-a8a7-26b2775651e1-kube-api-access-rhqdp\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 
19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850647 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqf2\" (UniqueName: \"kubernetes.io/projected/2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac-kube-api-access-6nqf2\") pod \"test-operator-controller-manager-7866795846-dbdmf\" (UID: \"2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac\") " pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.850666 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.855009 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.864287 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.865864 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.866315 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.867812 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-52tp7" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.871807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6skxh\" (UniqueName: \"kubernetes.io/projected/792a7a0a-a11e-42ce-a99b-e24127e7bbe8-kube-api-access-6skxh\") pod \"telemetry-operator-controller-manager-5484b6858b-7g48c\" (UID: \"792a7a0a-a11e-42ce-a99b-e24127e7bbe8\") " pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.872142 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nqf2\" (UniqueName: \"kubernetes.io/projected/2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac-kube-api-access-6nqf2\") pod \"test-operator-controller-manager-7866795846-dbdmf\" (UID: \"2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac\") " pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.906891 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d"] Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.952931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrft7\" (UniqueName: \"kubernetes.io/projected/f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0-kube-api-access-nrft7\") pod \"watcher-operator-controller-manager-5db88f68c-zdfxj\" (UID: \"f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.952999 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rhqdp\" (UniqueName: \"kubernetes.io/projected/12f061e0-51af-4ab9-a8a7-26b2775651e1-kube-api-access-rhqdp\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.953049 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.953099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfdzq\" (UniqueName: \"kubernetes.io/projected/65b17979-6c94-40e6-ac54-41a61a726e87-kube-api-access-sfdzq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjv7d\" (UID: \"65b17979-6c94-40e6-ac54-41a61a726e87\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.953178 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.953227 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod 
\"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.953379 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.953438 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert podName:421f6539-4fcb-4949-ba29-34997fc98490 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:09.953418959 +0000 UTC m=+1009.565769283 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert") pod "infra-operator-controller-manager-79d975b745-q5kgj" (UID: "421f6539-4fcb-4949-ba29-34997fc98490") : secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.954219 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.954253 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:09.454242614 +0000 UTC m=+1009.066592938 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "webhook-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.954344 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: E0219 19:35:08.954438 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:09.4544196 +0000 UTC m=+1009.066769924 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "metrics-server-cert" not found Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.991926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrft7\" (UniqueName: \"kubernetes.io/projected/f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0-kube-api-access-nrft7\") pod \"watcher-operator-controller-manager-5db88f68c-zdfxj\" (UID: \"f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:08 crc kubenswrapper[4722]: I0219 19:35:08.994089 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.026638 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.037003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhqdp\" (UniqueName: \"kubernetes.io/projected/12f061e0-51af-4ab9-a8a7-26b2775651e1-kube-api-access-rhqdp\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.059773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfdzq\" (UniqueName: \"kubernetes.io/projected/65b17979-6c94-40e6-ac54-41a61a726e87-kube-api-access-sfdzq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjv7d\" (UID: \"65b17979-6c94-40e6-ac54-41a61a726e87\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.069490 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.098999 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.104551 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfdzq\" (UniqueName: \"kubernetes.io/projected/65b17979-6c94-40e6-ac54-41a61a726e87-kube-api-access-sfdzq\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pjv7d\" (UID: \"65b17979-6c94-40e6-ac54-41a61a726e87\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.136826 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.147974 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.168382 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.208397 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.262122 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.262360 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.262407 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:10.26239153 +0000 UTC m=+1009.874741854 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.291543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" event={"ID":"edbe95e5-3a5d-4dec-9a94-509234857155","Type":"ContainerStarted","Data":"737a9653409a2ed747e71d9e9d0ff2d724ec46e51dcb4a7f4f3a696444b09f80"} Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.342776 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.397924 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.464922 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.465063 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 
19:35:09.465251 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.465312 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:10.465292055 +0000 UTC m=+1010.077642379 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "webhook-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.465314 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.465381 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:10.465362287 +0000 UTC m=+1010.077712611 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "metrics-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.753449 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.765842 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.783984 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr"] Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.915567 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6"] Feb 19 19:35:09 crc kubenswrapper[4722]: W0219 19:35:09.919836 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57783601_5230_49ef_8ac2_0ddf78bd4b3a.slice/crio-6095a935ed7e763a9a202ff9d935988e6a0840b0d401a92872b37fdc02763a9b WatchSource:0}: Error finding container 6095a935ed7e763a9a202ff9d935988e6a0840b0d401a92872b37fdc02763a9b: Status 404 returned error can't find the container with id 6095a935ed7e763a9a202ff9d935988e6a0840b0d401a92872b37fdc02763a9b Feb 19 19:35:09 crc kubenswrapper[4722]: I0219 19:35:09.976398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.976550 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:09 crc kubenswrapper[4722]: E0219 19:35:09.976615 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert podName:421f6539-4fcb-4949-ba29-34997fc98490 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:11.976599963 +0000 UTC m=+1011.588950277 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert") pod "infra-operator-controller-manager-79d975b745-q5kgj" (UID: "421f6539-4fcb-4949-ba29-34997fc98490") : secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.118048 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4"] Feb 19 19:35:10 crc kubenswrapper[4722]: W0219 19:35:10.125339 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb37b04c7_5374_49d3_97c0_5b5b27c4a220.slice/crio-defb2ae2adaaf2cd0a7459bb8506d8890fdbe78a3aa9d3ff0449899e1c15c563 WatchSource:0}: Error finding container defb2ae2adaaf2cd0a7459bb8506d8890fdbe78a3aa9d3ff0449899e1c15c563: Status 404 returned error can't find the container with id defb2ae2adaaf2cd0a7459bb8506d8890fdbe78a3aa9d3ff0449899e1c15c563 Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.125556 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg"] Feb 19 19:35:10 crc kubenswrapper[4722]: W0219 19:35:10.126279 4722 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b17979_6c94_40e6_ac54_41a61a726e87.slice/crio-9854cee110585702449268049c8880c7e765ae3df0402d3e14905c29bfe504e9 WatchSource:0}: Error finding container 9854cee110585702449268049c8880c7e765ae3df0402d3e14905c29bfe504e9: Status 404 returned error can't find the container with id 9854cee110585702449268049c8880c7e765ae3df0402d3e14905c29bfe504e9 Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.131341 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.135657 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.140030 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.160294 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.169923 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj"] Feb 19 19:35:10 crc kubenswrapper[4722]: W0219 19:35:10.183474 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6fb3554_24ea_4330_b2cb_1c91f105345d.slice/crio-8094ccbdad28f194ca0d1bd20e1575a47cb286404c119fd7c6fba62b1e12b207 WatchSource:0}: Error finding container 8094ccbdad28f194ca0d1bd20e1575a47cb286404c119fd7c6fba62b1e12b207: Status 404 returned error can't find the container with id 8094ccbdad28f194ca0d1bd20e1575a47cb286404c119fd7c6fba62b1e12b207 Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 
19:35:10.201714 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x5f69,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-hncxm_openstack-operators(2c02c7e1-6f72-44be-a4fb-10ca1df420aa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.209958 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" podUID="2c02c7e1-6f72-44be-a4fb-10ca1df420aa" Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.281101 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.281347 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.281408 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:12.281389464 +0000 UTC m=+1011.893739788 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.309523 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-dbdmf"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.312800 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" event={"ID":"2c02c7e1-6f72-44be-a4fb-10ca1df420aa","Type":"ContainerStarted","Data":"88a0d923d1f4f45462f59caff291c2d46d7aec53ff989922ac41ce29b90d0431"} Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.313705 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" podUID="2c02c7e1-6f72-44be-a4fb-10ca1df420aa" Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.313952 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.314956 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" event={"ID":"a6fb3554-24ea-4330-b2cb-1c91f105345d","Type":"ContainerStarted","Data":"8094ccbdad28f194ca0d1bd20e1575a47cb286404c119fd7c6fba62b1e12b207"} Feb 19 19:35:10 crc kubenswrapper[4722]: W0219 19:35:10.322331 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64ff9a64_f79f_4a45_943d_36152964cfcd.slice/crio-724fb1f64dd2b17a9b001d85a88816639278e6c666dc5366dbf5b675d2e76d82 WatchSource:0}: Error finding container 724fb1f64dd2b17a9b001d85a88816639278e6c666dc5366dbf5b675d2e76d82: Status 404 returned error can't find the container with id 724fb1f64dd2b17a9b001d85a88816639278e6c666dc5366dbf5b675d2e76d82 Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.322662 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" event={"ID":"57783601-5230-49ef-8ac2-0ddf78bd4b3a","Type":"ContainerStarted","Data":"6095a935ed7e763a9a202ff9d935988e6a0840b0d401a92872b37fdc02763a9b"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.324322 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" event={"ID":"820eede6-6396-4466-bf00-5d3b39d982d6","Type":"ContainerStarted","Data":"17fb740d71db5c3594c816c5ccaa09c5afbf67be88fdd88e1ac550ff74c2deb6"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.334227 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-wktqn"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.336931 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" 
event={"ID":"65b17979-6c94-40e6-ac54-41a61a726e87","Type":"ContainerStarted","Data":"9854cee110585702449268049c8880c7e765ae3df0402d3e14905c29bfe504e9"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.338411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" event={"ID":"766eebc1-05fc-4ca0-8c75-276632a6597e","Type":"ContainerStarted","Data":"b1b2d484e355b55c092d25a3a139271cf0cd78ecc1c8de758660f56a7e1fd34b"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.343486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" event={"ID":"db329f91-74f2-4baa-ab5a-85ad999fc8ef","Type":"ContainerStarted","Data":"80c6af299b43c30d0772a91d8590b548a277a17a3b240bf9f9a325e6e31ae74e"} Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.343957 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6nqf2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-dbdmf_openstack-operators(2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.346419 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" podUID="2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac" Feb 19 19:35:10 crc 
kubenswrapper[4722]: I0219 19:35:10.347450 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" event={"ID":"b64009a1-83ef-4d66-bc6b-80ccfc6f7727","Type":"ContainerStarted","Data":"e63c5aac3215063983439da9853bc504afc8d16b9b380b482608e7e4b3e0f990"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.357588 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c"] Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.358432 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:b83e41b73854b29b5d6860a1d9e9ac7611640781,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6skxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5484b6858b-7g48c_openstack-operators(792a7a0a-a11e-42ce-a99b-e24127e7bbe8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.359040 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" event={"ID":"b37b04c7-5374-49d3-97c0-5b5b27c4a220","Type":"ContainerStarted","Data":"defb2ae2adaaf2cd0a7459bb8506d8890fdbe78a3aa9d3ff0449899e1c15c563"} Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.359862 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" podUID="792a7a0a-a11e-42ce-a99b-e24127e7bbe8" Feb 19 19:35:10 crc 
kubenswrapper[4722]: I0219 19:35:10.361832 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" event={"ID":"c36983b4-b7f9-4834-85e9-a5c3cb83eb2d","Type":"ContainerStarted","Data":"ffc3d12e367cebd1cfb82440cd3115cefabcea3a1340171d9ec9798cc1d9e90e"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.363354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" event={"ID":"019f7edd-1d9b-4069-a2a1-36bbe6b0a567","Type":"ContainerStarted","Data":"13c6429d8da153184ee95dcd6ec0c902b204ffb2db95645138a02a88428298c4"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.364047 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" event={"ID":"baba09d1-2238-4ca1-98ee-f44938b68cd3","Type":"ContainerStarted","Data":"2149a0615d3124e22a80a9aeb9434ab419be0511f46fc8bde6386401cf205fab"} Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.387700 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" event={"ID":"0af2e6ef-277d-4022-b42b-5639b589fef9","Type":"ContainerStarted","Data":"30c7d8cc11849ef878a5bbae8ba84f80de9816ad9cc6b57efc3acf91422921bb"} Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.388770 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7lmjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-wktqn_openstack-operators(29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.390277 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" event={"ID":"f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0","Type":"ContainerStarted","Data":"c7fc3e072fdb34adcb59876dc21c99e36b93d371565d45a283a079b2d908006d"} Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.390305 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" podUID="29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.392449 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h6njp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-6dlqc_openstack-operators(738a1346-88e9-4c4e-b7ce-1878736e2493): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.393782 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" podUID="738a1346-88e9-4c4e-b7ce-1878736e2493" Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.444723 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc"] Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.484225 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 
19:35:10.484414 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.484485 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:12.484468384 +0000 UTC m=+1012.096818708 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "metrics-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.484521 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: E0219 19:35:10.484573 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:12.484559137 +0000 UTC m=+1012.096909451 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "webhook-server-cert" not found Feb 19 19:35:10 crc kubenswrapper[4722]: I0219 19:35:10.484419 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.407755 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" event={"ID":"64ff9a64-f79f-4a45-943d-36152964cfcd","Type":"ContainerStarted","Data":"724fb1f64dd2b17a9b001d85a88816639278e6c666dc5366dbf5b675d2e76d82"} Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.410225 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" event={"ID":"29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8","Type":"ContainerStarted","Data":"edc8f542c0b4375ecca22ccf3c0080c11524922334b8014465b5b2037358f125"} Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.411483 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" event={"ID":"738a1346-88e9-4c4e-b7ce-1878736e2493","Type":"ContainerStarted","Data":"c57de2de3f17acfe662d63880d81fd9cd3117e55a8eb320b063c6decf3af5b86"} Feb 19 19:35:11 crc kubenswrapper[4722]: E0219 19:35:11.412373 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" podUID="29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8" Feb 19 19:35:11 crc kubenswrapper[4722]: E0219 19:35:11.413371 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" podUID="738a1346-88e9-4c4e-b7ce-1878736e2493" Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.414735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" event={"ID":"792a7a0a-a11e-42ce-a99b-e24127e7bbe8","Type":"ContainerStarted","Data":"8b7ef117b102669f6e7ea2cb36f9491a15af28ec5bf3aaac6435515beb9d51bd"} Feb 19 19:35:11 crc kubenswrapper[4722]: E0219 19:35:11.416495 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:b83e41b73854b29b5d6860a1d9e9ac7611640781\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" podUID="792a7a0a-a11e-42ce-a99b-e24127e7bbe8" Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.423838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" event={"ID":"2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac","Type":"ContainerStarted","Data":"d80c15863f489995e056ec96bf2ea5143b89334f5bceadf92e59c7b37cbc8120"} Feb 19 19:35:11 crc kubenswrapper[4722]: E0219 19:35:11.430884 4722 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" podUID="2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac" Feb 19 19:35:11 crc kubenswrapper[4722]: E0219 19:35:11.433171 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" podUID="2c02c7e1-6f72-44be-a4fb-10ca1df420aa" Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.798849 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:35:11 crc kubenswrapper[4722]: I0219 19:35:11.798929 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:35:12 crc kubenswrapper[4722]: I0219 19:35:12.034971 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 
19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.035182 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.035226 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert podName:421f6539-4fcb-4949-ba29-34997fc98490 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:16.035213063 +0000 UTC m=+1015.647563377 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert") pod "infra-operator-controller-manager-79d975b745-q5kgj" (UID: "421f6539-4fcb-4949-ba29-34997fc98490") : secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: I0219 19:35:12.349970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.350172 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.350263 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:16.350239063 +0000 UTC m=+1015.962589427 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.435597 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" podUID="2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac" Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.435695 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" podUID="738a1346-88e9-4c4e-b7ce-1878736e2493" Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.438680 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" podUID="29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8" Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.446338 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.17:5001/openstack-k8s-operators/telemetry-operator:b83e41b73854b29b5d6860a1d9e9ac7611640781\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" podUID="792a7a0a-a11e-42ce-a99b-e24127e7bbe8" Feb 19 19:35:12 crc kubenswrapper[4722]: I0219 19:35:12.553700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:12 crc kubenswrapper[4722]: I0219 19:35:12.553843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.553947 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.554019 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.554061 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:16.554038136 +0000 UTC m=+1016.166388460 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "metrics-server-cert" not found Feb 19 19:35:12 crc kubenswrapper[4722]: E0219 19:35:12.554091 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:16.554071057 +0000 UTC m=+1016.166421371 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: I0219 19:35:16.111255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.111439 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.111791 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert podName:421f6539-4fcb-4949-ba29-34997fc98490 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:24.111772487 +0000 UTC m=+1023.724122811 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert") pod "infra-operator-controller-manager-79d975b745-q5kgj" (UID: "421f6539-4fcb-4949-ba29-34997fc98490") : secret "infra-operator-webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: I0219 19:35:16.416306 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.416492 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.416605 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:24.416579308 +0000 UTC m=+1024.028929662 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: I0219 19:35:16.618944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:16 crc kubenswrapper[4722]: I0219 19:35:16.619019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.619192 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.619244 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:24.619231245 +0000 UTC m=+1024.231581569 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "metrics-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.619253 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 19:35:16 crc kubenswrapper[4722]: E0219 19:35:16.619396 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs podName:12f061e0-51af-4ab9-a8a7-26b2775651e1 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:24.619375049 +0000 UTC m=+1024.231725433 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs") pod "openstack-operator-controller-manager-5f8cf67456-vwhlj" (UID: "12f061e0-51af-4ab9-a8a7-26b2775651e1") : secret "webhook-server-cert" not found Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.001444 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.002095 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-67z9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-6t7g6_openstack-operators(57783601-5230-49ef-8ac2-0ddf78bd4b3a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.003965 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" podUID="57783601-5230-49ef-8ac2-0ddf78bd4b3a" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.491968 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.492191 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zvwq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-8cljg_openstack-operators(b37b04c7-5374-49d3-97c0-5b5b27c4a220): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.493712 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" podUID="b37b04c7-5374-49d3-97c0-5b5b27c4a220" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.520298 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" podUID="b37b04c7-5374-49d3-97c0-5b5b27c4a220" Feb 19 19:35:23 crc kubenswrapper[4722]: E0219 19:35:23.520635 4722 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" podUID="57783601-5230-49ef-8ac2-0ddf78bd4b3a" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.137539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.147883 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/421f6539-4fcb-4949-ba29-34997fc98490-cert\") pod \"infra-operator-controller-manager-79d975b745-q5kgj\" (UID: \"421f6539-4fcb-4949-ba29-34997fc98490\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.257452 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34" Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.257702 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jf42n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-zft4s_openstack-operators(a6fb3554-24ea-4330-b2cb-1c91f105345d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.258976 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" podUID="a6fb3554-24ea-4330-b2cb-1c91f105345d" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.410391 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-f2sz9" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.418056 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.441363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.441542 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.441624 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert podName:8870a7b1-f894-4429-9f52-d9063fe9c780 nodeName:}" failed. No retries permitted until 2026-02-19 19:35:40.441603531 +0000 UTC m=+1040.053953855 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" (UID: "8870a7b1-f894-4429-9f52-d9063fe9c780") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 19:35:24 crc kubenswrapper[4722]: E0219 19:35:24.527283 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" podUID="a6fb3554-24ea-4330-b2cb-1c91f105345d" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.644996 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.645132 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.650710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-metrics-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: 
\"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.668081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/12f061e0-51af-4ab9-a8a7-26b2775651e1-webhook-certs\") pod \"openstack-operator-controller-manager-5f8cf67456-vwhlj\" (UID: \"12f061e0-51af-4ab9-a8a7-26b2775651e1\") " pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.775245 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-4zvsr" Feb 19 19:35:24 crc kubenswrapper[4722]: I0219 19:35:24.783809 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.383920 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.384654 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nrft7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-zdfxj_openstack-operators(f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.386295 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" podUID="f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.539626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" podUID="f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.952144 4722 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.952525 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xwhl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-rnh9h_openstack-operators(c36983b4-b7f9-4834-85e9-a5c3cb83eb2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:26 crc kubenswrapper[4722]: E0219 19:35:26.953849 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" podUID="c36983b4-b7f9-4834-85e9-a5c3cb83eb2d" Feb 19 19:35:27 crc kubenswrapper[4722]: E0219 19:35:27.547922 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" podUID="c36983b4-b7f9-4834-85e9-a5c3cb83eb2d" Feb 19 19:35:27 crc kubenswrapper[4722]: E0219 19:35:27.748998 4722 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 19 19:35:27 crc kubenswrapper[4722]: E0219 19:35:27.749180 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xlwlx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-wqp5t_openstack-operators(64ff9a64-f79f-4a45-943d-36152964cfcd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:27 crc kubenswrapper[4722]: E0219 19:35:27.750437 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" podUID="64ff9a64-f79f-4a45-943d-36152964cfcd" Feb 19 19:35:28 crc kubenswrapper[4722]: E0219 19:35:28.554522 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" podUID="64ff9a64-f79f-4a45-943d-36152964cfcd" Feb 19 19:35:28 crc kubenswrapper[4722]: E0219 19:35:28.840208 4722 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c" Feb 19 19:35:28 crc kubenswrapper[4722]: E0219 19:35:28.840736 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wd8gb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-7qkx4_openstack-operators(766eebc1-05fc-4ca0-8c75-276632a6597e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:28 crc kubenswrapper[4722]: E0219 19:35:28.843195 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" podUID="766eebc1-05fc-4ca0-8c75-276632a6597e" Feb 19 19:35:29 crc kubenswrapper[4722]: E0219 19:35:29.303696 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 19 19:35:29 crc kubenswrapper[4722]: E0219 19:35:29.303846 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sfdzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pjv7d_openstack-operators(65b17979-6c94-40e6-ac54-41a61a726e87): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:35:29 crc kubenswrapper[4722]: E0219 19:35:29.305122 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" podUID="65b17979-6c94-40e6-ac54-41a61a726e87" Feb 19 19:35:29 crc kubenswrapper[4722]: E0219 19:35:29.560836 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" podUID="766eebc1-05fc-4ca0-8c75-276632a6597e" Feb 19 19:35:29 crc kubenswrapper[4722]: E0219 19:35:29.561317 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" podUID="65b17979-6c94-40e6-ac54-41a61a726e87" Feb 19 19:35:37 crc kubenswrapper[4722]: I0219 19:35:37.873173 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj"] Feb 19 19:35:37 crc kubenswrapper[4722]: I0219 19:35:37.926352 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj"] Feb 19 19:35:37 crc kubenswrapper[4722]: W0219 19:35:37.947771 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12f061e0_51af_4ab9_a8a7_26b2775651e1.slice/crio-04691f0ff38d298ef9dd378749487d75d9f831cb45abea9ef3befc6b3b5ae6e4 WatchSource:0}: Error finding container 04691f0ff38d298ef9dd378749487d75d9f831cb45abea9ef3befc6b3b5ae6e4: Status 404 returned error can't find the container with id 04691f0ff38d298ef9dd378749487d75d9f831cb45abea9ef3befc6b3b5ae6e4 Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.648689 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" event={"ID":"edbe95e5-3a5d-4dec-9a94-509234857155","Type":"ContainerStarted","Data":"4487cf700dbcf431a1d68b2c6ea8a0076fe01544c2a6a6ae5f033cbb2d122abc"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.649580 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.662831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" event={"ID":"db329f91-74f2-4baa-ab5a-85ad999fc8ef","Type":"ContainerStarted","Data":"4472e0683e8b574a01c4e15d88578e8633e04850588202934771e6a2438d064b"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.663018 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.680594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" event={"ID":"738a1346-88e9-4c4e-b7ce-1878736e2493","Type":"ContainerStarted","Data":"c5a84a777980c9b91618fc57728e85cfb33c312ce3f62a262306c9e4e0345dd9"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.681330 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.694471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" event={"ID":"0af2e6ef-277d-4022-b42b-5639b589fef9","Type":"ContainerStarted","Data":"1ac8c6f3f1f9fb0a9846964c89b7023c08c4efde23f1eae391e56ffa766266e5"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.695712 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.701007 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" event={"ID":"019f7edd-1d9b-4069-a2a1-36bbe6b0a567","Type":"ContainerStarted","Data":"ee31418e17898155a985a1968fecb7f9d9603894a860b0011d6cef67fe77d272"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.707256 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.720794 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" event={"ID":"2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac","Type":"ContainerStarted","Data":"715825cd8635f30e5ca4361acce74a04e2b838bed261bf1eec5e2bd2a7bfe4a6"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.734246 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.734712 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t" 
podStartSLOduration=10.685434639 podStartE2EDuration="30.734703318s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.245433561 +0000 UTC m=+1008.857783885" lastFinishedPulling="2026-02-19 19:35:29.29470224 +0000 UTC m=+1028.907052564" observedRunningTime="2026-02-19 19:35:38.734003366 +0000 UTC m=+1038.346353690" watchObservedRunningTime="2026-02-19 19:35:38.734703318 +0000 UTC m=+1038.347053642" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.746031 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" event={"ID":"2c02c7e1-6f72-44be-a4fb-10ca1df420aa","Type":"ContainerStarted","Data":"3d535cda1b0c8c51f72486f4353c197b016a10cbac2eedf9e290116c8c69c370"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.747015 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.787053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" event={"ID":"12f061e0-51af-4ab9-a8a7-26b2775651e1","Type":"ContainerStarted","Data":"06677fea9cfd054dd44ccb373c1289a9ec88d0e4622ac4f8ecc38a445986540a"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.787095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" event={"ID":"12f061e0-51af-4ab9-a8a7-26b2775651e1","Type":"ContainerStarted","Data":"04691f0ff38d298ef9dd378749487d75d9f831cb45abea9ef3befc6b3b5ae6e4"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.787656 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.837143 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" event={"ID":"b64009a1-83ef-4d66-bc6b-80ccfc6f7727","Type":"ContainerStarted","Data":"80f93f3f8813f9698f7662a057f164a7481334cf04f7871173fb2dc5ec4ad8a7"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.837206 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.847229 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc" podStartSLOduration=3.691088167 podStartE2EDuration="30.847213716s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.390189725 +0000 UTC m=+1010.002540049" lastFinishedPulling="2026-02-19 19:35:37.546315264 +0000 UTC m=+1037.158665598" observedRunningTime="2026-02-19 19:35:38.789440624 +0000 UTC m=+1038.401790948" watchObservedRunningTime="2026-02-19 19:35:38.847213716 +0000 UTC m=+1038.459564040" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.847913 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8" podStartSLOduration=10.857382340000001 podStartE2EDuration="30.847909037s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.292809808 +0000 UTC m=+1008.905160132" lastFinishedPulling="2026-02-19 19:35:29.283336505 +0000 UTC m=+1028.895686829" observedRunningTime="2026-02-19 19:35:38.842666003 +0000 UTC m=+1038.455016337" watchObservedRunningTime="2026-02-19 19:35:38.847909037 +0000 UTC m=+1038.460259361" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.853971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" event={"ID":"421f6539-4fcb-4949-ba29-34997fc98490","Type":"ContainerStarted","Data":"879a20f9d3ebd9c5640fe2cc700408edc14d4f73100e04e6e46aba7e09b03919"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.864627 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" event={"ID":"792a7a0a-a11e-42ce-a99b-e24127e7bbe8","Type":"ContainerStarted","Data":"895f73332c259f575167ccd7c2f4c99c0df02eeed6dfa15598e840c070aee417"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.865224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.885435 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54" podStartSLOduration=12.375692386 podStartE2EDuration="31.885417436s" podCreationTimestamp="2026-02-19 19:35:07 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.784944509 +0000 UTC m=+1009.397294833" lastFinishedPulling="2026-02-19 19:35:29.294669559 +0000 UTC m=+1028.907019883" observedRunningTime="2026-02-19 19:35:38.883556089 +0000 UTC m=+1038.495906413" watchObservedRunningTime="2026-02-19 19:35:38.885417436 +0000 UTC m=+1038.497767760" Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.892370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" event={"ID":"baba09d1-2238-4ca1-98ee-f44938b68cd3","Type":"ContainerStarted","Data":"a1fb15869401aae08d17f4bf12a1969f0cc4c8c77d9e47b782efab7d76fc54bf"} Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.892876 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g"
Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.931863 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf" podStartSLOduration=3.718078847 podStartE2EDuration="30.931849863s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.343850931 +0000 UTC m=+1009.956201245" lastFinishedPulling="2026-02-19 19:35:37.557621937 +0000 UTC m=+1037.169972261" observedRunningTime="2026-02-19 19:35:38.929393427 +0000 UTC m=+1038.541743751" watchObservedRunningTime="2026-02-19 19:35:38.931849863 +0000 UTC m=+1038.544200187"
Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.950385 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" event={"ID":"29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8","Type":"ContainerStarted","Data":"7358b3ea5bc9182b4c1e8bec442955fc6746af482a0c956a6c321ee2d9a44602"}
Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.951067 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn"
Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.955809 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7" podStartSLOduration=5.607891316 podStartE2EDuration="30.95579352s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.758325799 +0000 UTC m=+1009.370676123" lastFinishedPulling="2026-02-19 19:35:35.106227963 +0000 UTC m=+1034.718578327" observedRunningTime="2026-02-19 19:35:38.955761939 +0000 UTC m=+1038.568112263" watchObservedRunningTime="2026-02-19 19:35:38.95579352 +0000 UTC m=+1038.568143844"
Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.964836 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" event={"ID":"820eede6-6396-4466-bf00-5d3b39d982d6","Type":"ContainerStarted","Data":"26a3364052ca45767994b680e3619d00780c066d9c7e579ff0b47e2405a21d62"}
Feb 19 19:35:38 crc kubenswrapper[4722]: I0219 19:35:38.965551 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.045136 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj" podStartSLOduration=31.045115074 podStartE2EDuration="31.045115074s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:35:39.043784472 +0000 UTC m=+1038.656134796" watchObservedRunningTime="2026-02-19 19:35:39.045115074 +0000 UTC m=+1038.657465398"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.082054 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g" podStartSLOduration=11.153746498 podStartE2EDuration="31.082027345s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.366313989 +0000 UTC m=+1008.978664313" lastFinishedPulling="2026-02-19 19:35:29.294594836 +0000 UTC m=+1028.906945160" observedRunningTime="2026-02-19 19:35:39.07995449 +0000 UTC m=+1038.692304814" watchObservedRunningTime="2026-02-19 19:35:39.082027345 +0000 UTC m=+1038.694377669"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.138033 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn" podStartSLOduration=3.935187835 podStartE2EDuration="31.13801222s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.388641647 +0000 UTC m=+1010.000991971" lastFinishedPulling="2026-02-19 19:35:37.591466032 +0000 UTC m=+1037.203816356" observedRunningTime="2026-02-19 19:35:39.130189616 +0000 UTC m=+1038.742539940" watchObservedRunningTime="2026-02-19 19:35:39.13801222 +0000 UTC m=+1038.750362544"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.139203 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c" podStartSLOduration=3.869543 podStartE2EDuration="31.139196807s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.358276131 +0000 UTC m=+1009.970626455" lastFinishedPulling="2026-02-19 19:35:37.627929938 +0000 UTC m=+1037.240280262" observedRunningTime="2026-02-19 19:35:39.104543357 +0000 UTC m=+1038.716893681" watchObservedRunningTime="2026-02-19 19:35:39.139196807 +0000 UTC m=+1038.751547141"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.153856 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm" podStartSLOduration=5.048998655 podStartE2EDuration="31.153838143s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.201595797 +0000 UTC m=+1009.813946121" lastFinishedPulling="2026-02-19 19:35:36.306435245 +0000 UTC m=+1035.918785609" observedRunningTime="2026-02-19 19:35:39.153588795 +0000 UTC m=+1038.765939119" watchObservedRunningTime="2026-02-19 19:35:39.153838143 +0000 UTC m=+1038.766188467"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.184145 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr" podStartSLOduration=12.6908621 podStartE2EDuration="32.184129237s" podCreationTimestamp="2026-02-19 19:35:07 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.801373851 +0000 UTC m=+1009.413724175" lastFinishedPulling="2026-02-19 19:35:29.294640968 +0000 UTC m=+1028.906991312" observedRunningTime="2026-02-19 19:35:39.179811183 +0000 UTC m=+1038.792161507" watchObservedRunningTime="2026-02-19 19:35:39.184129237 +0000 UTC m=+1038.796479561"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.976658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" event={"ID":"b37b04c7-5374-49d3-97c0-5b5b27c4a220","Type":"ContainerStarted","Data":"eace4536b524ad6edbe4682a42e6e2710ecc4fddd0b7f739d80ca9b434c3e2af"}
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.976966 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.977826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" event={"ID":"a6fb3554-24ea-4330-b2cb-1c91f105345d","Type":"ContainerStarted","Data":"40c844911fae397675c6c86544f97ad54db860a1ebf4432da0f6b43be2fc9a61"}
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.978201 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.979776 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" event={"ID":"f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0","Type":"ContainerStarted","Data":"a23bb12790be74459bed2f278406f1f1090f41208d9ede70355aa9867af55d13"}
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.979957 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.982897 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" event={"ID":"57783601-5230-49ef-8ac2-0ddf78bd4b3a","Type":"ContainerStarted","Data":"71ca4f26dab6902a38fc13985df5823e5d5ebc4fa251606ba1aa0ad892a413cb"}
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.983267 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.991688 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg" podStartSLOduration=3.415896578 podStartE2EDuration="31.99166938s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.127169466 +0000 UTC m=+1009.739519790" lastFinishedPulling="2026-02-19 19:35:38.702942268 +0000 UTC m=+1038.315292592" observedRunningTime="2026-02-19 19:35:39.991109562 +0000 UTC m=+1039.603459896" watchObservedRunningTime="2026-02-19 19:35:39.99166938 +0000 UTC m=+1039.604019704"
Feb 19 19:35:39 crc kubenswrapper[4722]: I0219 19:35:39.994000 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq" podStartSLOduration=12.832664663 podStartE2EDuration="31.993991722s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.134293779 +0000 UTC m=+1009.746644093" lastFinishedPulling="2026-02-19 19:35:29.295620828 +0000 UTC m=+1028.907971152" observedRunningTime="2026-02-19 19:35:39.202568813 +0000 UTC m=+1038.814919137" watchObservedRunningTime="2026-02-19 19:35:39.993991722 +0000 UTC m=+1039.606342046"
Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.016510 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj" podStartSLOduration=3.655373883 podStartE2EDuration="32.016493444s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.184776192 +0000 UTC m=+1009.797126516" lastFinishedPulling="2026-02-19 19:35:38.545895753 +0000 UTC m=+1038.158246077" observedRunningTime="2026-02-19 19:35:40.01122497 +0000 UTC m=+1039.623575304" watchObservedRunningTime="2026-02-19 19:35:40.016493444 +0000 UTC m=+1039.628843768"
Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.043847 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s" podStartSLOduration=3.3591667689999998 podStartE2EDuration="32.043826515s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.185043231 +0000 UTC m=+1009.797393555" lastFinishedPulling="2026-02-19 19:35:38.869702977 +0000 UTC m=+1038.482053301" observedRunningTime="2026-02-19 19:35:40.036692804 +0000 UTC m=+1039.649043118" watchObservedRunningTime="2026-02-19 19:35:40.043826515 +0000 UTC m=+1039.656176849"
Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.510373 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"
Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.520184 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8870a7b1-f894-4429-9f52-d9063fe9c780-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9csntwh\" (UID: \"8870a7b1-f894-4429-9f52-d9063fe9c780\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"
Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.773185 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tmczn"
Feb 19 19:35:40 crc kubenswrapper[4722]: I0219 19:35:40.781735 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"
Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.108878 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6" podStartSLOduration=4.486920473 podStartE2EDuration="33.108837784s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.922844667 +0000 UTC m=+1009.535194991" lastFinishedPulling="2026-02-19 19:35:38.544761978 +0000 UTC m=+1038.157112302" observedRunningTime="2026-02-19 19:35:40.06033548 +0000 UTC m=+1039.672685794" watchObservedRunningTime="2026-02-19 19:35:41.108837784 +0000 UTC m=+1040.721188108"
Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.453630 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"]
Feb 19 19:35:41 crc kubenswrapper[4722]: W0219 19:35:41.456769 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8870a7b1_f894_4429_9f52_d9063fe9c780.slice/crio-f292d2e4230c55903ff11ab9420bdd72dd5e9a4200462482042e068818cb9e39 WatchSource:0}: Error finding container f292d2e4230c55903ff11ab9420bdd72dd5e9a4200462482042e068818cb9e39: Status 404 returned error can't find the container with id f292d2e4230c55903ff11ab9420bdd72dd5e9a4200462482042e068818cb9e39
Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.798548 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.798605 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.798652 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl"
Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.799297 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 19:35:41 crc kubenswrapper[4722]: I0219 19:35:41.799360 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1" gracePeriod=600
Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.008283 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" event={"ID":"8870a7b1-f894-4429-9f52-d9063fe9c780","Type":"ContainerStarted","Data":"f292d2e4230c55903ff11ab9420bdd72dd5e9a4200462482042e068818cb9e39"}
Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.009777 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" event={"ID":"421f6539-4fcb-4949-ba29-34997fc98490","Type":"ContainerStarted","Data":"db79c9b9cbeac9eb51c0878d86de830bf4fb98b90efada8b7b4e6c5b99028afa"}
Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.009891 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj"
Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.011759 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1" exitCode=0
Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.011786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1"}
Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.011807 4722 scope.go:117] "RemoveContainer" containerID="66078169c6e38cc91acddc273dfade3d624308d325857d7f5a0c20b40b5ebc84"
Feb 19 19:35:42 crc kubenswrapper[4722]: I0219 19:35:42.026691 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj" podStartSLOduration=30.99360685 podStartE2EDuration="34.026671104s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:37.920465327 +0000 UTC m=+1037.532815652" lastFinishedPulling="2026-02-19 19:35:40.953529582 +0000 UTC m=+1040.565879906" observedRunningTime="2026-02-19 19:35:42.024556808 +0000 UTC m=+1041.636907142" watchObservedRunningTime="2026-02-19 19:35:42.026671104 +0000 UTC m=+1041.639021428"
Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.023146 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5"}
Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.026408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" event={"ID":"c36983b4-b7f9-4834-85e9-a5c3cb83eb2d","Type":"ContainerStarted","Data":"d6201fb908c7d195c433333c870815690f49158d0e297e9249bae28ad2dcc2a9"}
Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.026730 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h"
Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.029430 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" event={"ID":"64ff9a64-f79f-4a45-943d-36152964cfcd","Type":"ContainerStarted","Data":"7f19d3f0b25f57ab1510a777e2d1e7208095b102be91033f9de906954cdfc74c"}
Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.029795 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t"
Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.068061 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t" podStartSLOduration=2.8579883170000002 podStartE2EDuration="35.068039175s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.328409849 +0000 UTC m=+1009.940760183" lastFinishedPulling="2026-02-19 19:35:42.538460717 +0000 UTC m=+1042.150811041" observedRunningTime="2026-02-19 19:35:43.063033629 +0000 UTC m=+1042.675383963" watchObservedRunningTime="2026-02-19 19:35:43.068039175 +0000 UTC m=+1042.680389499"
Feb 19 19:35:43 crc kubenswrapper[4722]: I0219 19:35:43.084645 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h" podStartSLOduration=1.895200435 podStartE2EDuration="35.084626432s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:09.462382934 +0000 UTC m=+1009.074733258" lastFinishedPulling="2026-02-19 19:35:42.651808921 +0000 UTC m=+1042.264159255" observedRunningTime="2026-02-19 19:35:43.080796882 +0000 UTC m=+1042.693147216" watchObservedRunningTime="2026-02-19 19:35:43.084626432 +0000 UTC m=+1042.696976766"
Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.079095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" event={"ID":"8870a7b1-f894-4429-9f52-d9063fe9c780","Type":"ContainerStarted","Data":"963486cba8459481d456b56600ea7f4d785af66779ff1abf7d31cb976a76d52e"}
Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.079464 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"
Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.107226 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" event={"ID":"65b17979-6c94-40e6-ac54-41a61a726e87","Type":"ContainerStarted","Data":"c20e6e7717a98848bf56f29ac034b7985f710e16b337af1496d189a7c5c984c3"}
Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.153618 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pjv7d" podStartSLOduration=2.71085826 podStartE2EDuration="36.153599184s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.129708465 +0000 UTC m=+1009.742058789" lastFinishedPulling="2026-02-19 19:35:43.572449369 +0000 UTC m=+1043.184799713" observedRunningTime="2026-02-19 19:35:44.149552168 +0000 UTC m=+1043.761902492" watchObservedRunningTime="2026-02-19 19:35:44.153599184 +0000 UTC m=+1043.765949508"
Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.215144 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh" podStartSLOduration=34.101983912 podStartE2EDuration="36.215126642s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:41.458645218 +0000 UTC m=+1041.070995542" lastFinishedPulling="2026-02-19 19:35:43.571787948 +0000 UTC m=+1043.184138272" observedRunningTime="2026-02-19 19:35:44.199565127 +0000 UTC m=+1043.811915451" watchObservedRunningTime="2026-02-19 19:35:44.215126642 +0000 UTC m=+1043.827476966"
Feb 19 19:35:44 crc kubenswrapper[4722]: I0219 19:35:44.789603 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5f8cf67456-vwhlj"
Feb 19 19:35:45 crc kubenswrapper[4722]: I0219 19:35:45.114905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" event={"ID":"766eebc1-05fc-4ca0-8c75-276632a6597e","Type":"ContainerStarted","Data":"488158f43ee85ef4bd2e7550885b354719b06bfeefba9b5443847a65185aeafd"}
Feb 19 19:35:45 crc kubenswrapper[4722]: I0219 19:35:45.115307 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4"
Feb 19 19:35:45 crc kubenswrapper[4722]: I0219 19:35:45.134487 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4" podStartSLOduration=2.677282515 podStartE2EDuration="37.13447121s" podCreationTimestamp="2026-02-19 19:35:08 +0000 UTC" firstStartedPulling="2026-02-19 19:35:10.143325621 +0000 UTC m=+1009.755675945" lastFinishedPulling="2026-02-19 19:35:44.600514316 +0000 UTC m=+1044.212864640" observedRunningTime="2026-02-19 19:35:45.129121053 +0000 UTC m=+1044.741471377" watchObservedRunningTime="2026-02-19 19:35:45.13447121 +0000 UTC m=+1044.746821524"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.395529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-mc64t"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.412801 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-qrsw8"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.441554 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-hxv5g"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.522446 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-rnh9h"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.599098 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-x6wk7"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.632500 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5c54"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.647330 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-x7bwr"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.667669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6t7g6"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.755332 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-hncxm"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.802708 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-8cljg"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.817421 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-wqp5t"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.856852 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-zft4s"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.868727 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-6dlqc"
Feb 19 19:35:48 crc kubenswrapper[4722]: I0219 19:35:48.996280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mgzgq"
Feb 19 19:35:49 crc kubenswrapper[4722]: I0219 19:35:49.028851 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-wktqn"
Feb 19 19:35:49 crc kubenswrapper[4722]: I0219 19:35:49.079305 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5484b6858b-7g48c"
Feb 19 19:35:49 crc kubenswrapper[4722]: I0219 19:35:49.103732 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-dbdmf"
Feb 19 19:35:49 crc kubenswrapper[4722]: I0219 19:35:49.141589 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-zdfxj"
Feb 19 19:35:50 crc kubenswrapper[4722]: I0219 19:35:50.791075 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9csntwh"
Feb 19 19:35:54 crc kubenswrapper[4722]: I0219 19:35:54.432523 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-q5kgj"
Feb 19 19:35:58 crc kubenswrapper[4722]: I0219 19:35:58.621379 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-7qkx4"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.906678 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"]
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.908305 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.910600 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.910777 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.910651 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-kvsd7"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.910719 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.919743 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"]
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.969953 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"]
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.971068 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.973579 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.986665 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"]
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.995141 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8pmh\" (UniqueName: \"kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.995225 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.995271 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcgth\" (UniqueName: \"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.995330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml"
Feb 19 19:36:19 crc kubenswrapper[4722]: I0219 19:36:19.995375 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.096698 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.096867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcgth\" (UniqueName: \"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.097161 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.097216 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.097285 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8pmh\" (UniqueName: \"kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.098133 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.098470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.098964 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.118879 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8pmh\" (UniqueName: \"kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh\") pod \"dnsmasq-dns-675f4bcbfc-nngml\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-nngml"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.120193 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcgth\" (UniqueName: \"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth\") pod \"dnsmasq-dns-78dd6ddcc-9t97w\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.232660 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.284822 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w"
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.658701 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"]
Feb 19 19:36:20 crc kubenswrapper[4722]: I0219 19:36:20.754647 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"]
Feb 19 19:36:20 crc kubenswrapper[4722]: W0219 19:36:20.764611 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7421fc0e_3cfa_49af_a4c0_90807314bb61.slice/crio-9c8b478ee26b023117fcef2ff288db2274a47d09a770d88e80127097d4da0163 WatchSource:0}: Error finding container 9c8b478ee26b023117fcef2ff288db2274a47d09a770d88e80127097d4da0163: Status 404 returned error can't find the container with id 9c8b478ee26b023117fcef2ff288db2274a47d09a770d88e80127097d4da0163
Feb 19 19:36:21 crc kubenswrapper[4722]: I0219 19:36:21.408508 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" event={"ID":"7421fc0e-3cfa-49af-a4c0-90807314bb61","Type":"ContainerStarted","Data":"9c8b478ee26b023117fcef2ff288db2274a47d09a770d88e80127097d4da0163"}
Feb 19 19:36:21 crc kubenswrapper[4722]: I0219 19:36:21.410375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" event={"ID":"af3a7297-2590-4a47-baa0-cd5b6029b6a4","Type":"ContainerStarted","Data":"bfeedc6c93dfc0ff6029b23856b61146b86096c4ced635aad0f319bef89af5a1"}
Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.596017 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"]
Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.627488 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"]
Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.628848 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dt86l"
Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.635888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l"
Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.635992 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l"
Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.636015 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf2km\" (UniqueName: \"kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l"
Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.653373 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"] Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.737303 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.737374 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf2km\" (UniqueName: \"kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.737392 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.738310 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.738317 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.770646 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf2km\" (UniqueName: \"kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km\") pod \"dnsmasq-dns-666b6646f7-dt86l\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.945590 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"] Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.953694 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.970794 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.972337 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:22 crc kubenswrapper[4722]: I0219 19:36:22.992284 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.142450 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpz6x\" (UniqueName: \"kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.145457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" 
Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.146710 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.248427 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpz6x\" (UniqueName: \"kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.248473 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.248578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.249365 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.250118 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.272581 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpz6x\" (UniqueName: \"kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x\") pod \"dnsmasq-dns-57d769cc4f-jmsp2\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.352132 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.563524 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"] Feb 19 19:36:23 crc kubenswrapper[4722]: W0219 19:36:23.571170 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d3acaf5_2a64_4dbd_8a74_f4e10cbd5465.slice/crio-f69d565a432964650b0242acf131b696304e0220fdb9a6380a634faf46d56f00 WatchSource:0}: Error finding container f69d565a432964650b0242acf131b696304e0220fdb9a6380a634faf46d56f00: Status 404 returned error can't find the container with id f69d565a432964650b0242acf131b696304e0220fdb9a6380a634faf46d56f00 Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.794392 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.796078 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.798820 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.799105 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.799201 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cbm8q" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.799317 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.799434 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.799587 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.800131 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.806754 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.871914 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 
19:36:23.957382 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56t8r\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957564 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957595 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957669 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:23 crc kubenswrapper[4722]: I0219 19:36:23.957780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060040 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060132 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060191 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060214 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56t8r\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060231 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060250 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060268 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.060412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc 
kubenswrapper[4722]: I0219 19:36:24.061575 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.063879 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.064976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.065399 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.065778 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.066569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.067325 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.067356 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5f6cee635ca5e2d348cf915d62a0dac8d2194b66bba55200fe901088eac3f7dd/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.067323 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.080798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.084724 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56t8r\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r\") pod \"rabbitmq-server-0\" (UID: 
\"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.086955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.101479 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.124507 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.127520 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.132677 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.136791 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.132549 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.137657 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.139510 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.139544 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.139552 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.139760 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qdf2m" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.146778 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.271450 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.271887 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.271957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272029 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272073 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272166 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272242 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5nxh\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272274 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.272296 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373347 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373458 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373487 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373514 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5nxh\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373547 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373675 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.373735 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.376601 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.376888 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.377442 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.377615 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.377870 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393307 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393506 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393884 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393910 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6408a1f41ebba08884844654cc07aafa4a02aa7486293e45dd19f823f7662d43/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.393957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.406821 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5nxh\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.474352 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.514457 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" 
event={"ID":"17b6c8b5-9711-4601-a0fd-a1f528e97287","Type":"ContainerStarted","Data":"967b9f32e64a62fdd1e64949dcea547f81e3dffcf01eb6300e42284f5721c31d"} Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.515430 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" event={"ID":"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465","Type":"ContainerStarted","Data":"f69d565a432964650b0242acf131b696304e0220fdb9a6380a634faf46d56f00"} Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.767669 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:36:24 crc kubenswrapper[4722]: I0219 19:36:24.811003 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.330855 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.422144 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.423561 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.435463 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.436052 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.436274 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.436426 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.436559 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-98pfc" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.436762 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.537458 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerStarted","Data":"24279a6d2caf7ad4b1f181fa89124ed3ff752cfc1180df75df7a96c88d0345e2"} Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620188 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620239 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-default\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620291 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620327 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kolla-config\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620358 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82cjh\" (UniqueName: \"kubernetes.io/projected/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kube-api-access-82cjh\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620415 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.620436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-default\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725422 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725462 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725510 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82cjh\" (UniqueName: \"kubernetes.io/projected/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kube-api-access-82cjh\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.725648 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.727978 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " 
pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.728023 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-default\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.729379 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kolla-config\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.731134 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/53444e7f-4c1d-401b-9896-5ff9c4aab65a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.761944 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.762093 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53444e7f-4c1d-401b-9896-5ff9c4aab65a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.762442 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.762486 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6c8f62df97f865453863e26b12aa68d2572f80e4101124fe07995cb8bbe4bb98/globalmount\"" pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.766756 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82cjh\" (UniqueName: \"kubernetes.io/projected/53444e7f-4c1d-401b-9896-5ff9c4aab65a-kube-api-access-82cjh\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:25 crc kubenswrapper[4722]: I0219 19:36:25.815344 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13297c0c-d6c3-4eb0-965c-07fc337cd174\") pod \"openstack-galera-0\" (UID: \"53444e7f-4c1d-401b-9896-5ff9c4aab65a\") " pod="openstack/openstack-galera-0" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.062090 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.853706 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.855941 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.862373 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.862797 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8hfxd" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.865567 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.865792 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.877534 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.925136 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.926414 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.935993 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.936462 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.936563 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 19 19:36:26 crc kubenswrapper[4722]: I0219 19:36:26.936730 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-r9hpl" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053777 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-memcached-tls-certs\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-config-data\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053851 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053868 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kolla-config\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053915 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4gzc\" (UniqueName: \"kubernetes.io/projected/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kube-api-access-f4gzc\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053952 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.053968 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-combined-ca-bundle\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.054031 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.054069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.054101 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twqwb\" (UniqueName: \"kubernetes.io/projected/a07f9633-74f5-48e5-8467-d649fc49a2ff-kube-api-access-twqwb\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.054132 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155570 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-memcached-tls-certs\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-config-data\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kolla-config\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155798 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4gzc\" (UniqueName: \"kubernetes.io/projected/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kube-api-access-f4gzc\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-combined-ca-bundle\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.155929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twqwb\" (UniqueName: \"kubernetes.io/projected/a07f9633-74f5-48e5-8467-d649fc49a2ff-kube-api-access-twqwb\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.157084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kolla-config\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.157339 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.158140 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc 
kubenswrapper[4722]: I0219 19:36:27.158762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.158807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-config-data\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.160338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a07f9633-74f5-48e5-8467-d649fc49a2ff-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.164751 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.166538 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-memcached-tls-certs\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.176629 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-combined-ca-bundle\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.181098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4gzc\" (UniqueName: \"kubernetes.io/projected/059950bd-4e60-42e6-a9c6-4e4ab0b039aa-kube-api-access-f4gzc\") pod \"memcached-0\" (UID: \"059950bd-4e60-42e6-a9c6-4e4ab0b039aa\") " pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.181278 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twqwb\" (UniqueName: \"kubernetes.io/projected/a07f9633-74f5-48e5-8467-d649fc49a2ff-kube-api-access-twqwb\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.181755 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07f9633-74f5-48e5-8467-d649fc49a2ff-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.184109 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.189352 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/89de2f74d00e86341d558d2e9eae6b444d9d706847f448e590c53d8ab50a529c/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.228508 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-804d4dc6-23d8-464a-9f60-fb4e6dbbd35c\") pod \"openstack-cell1-galera-0\" (UID: \"a07f9633-74f5-48e5-8467-d649fc49a2ff\") " pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.248174 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 19:36:27 crc kubenswrapper[4722]: I0219 19:36:27.478689 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 19:36:28 crc kubenswrapper[4722]: I0219 19:36:28.987126 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:36:28 crc kubenswrapper[4722]: I0219 19:36:28.988340 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 19:36:28 crc kubenswrapper[4722]: I0219 19:36:28.991085 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cf9vh" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.000128 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.098434 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c6z8\" (UniqueName: \"kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8\") pod \"kube-state-metrics-0\" (UID: \"14a7aae0-6a51-49ed-b4dd-9b274885d1da\") " pod="openstack/kube-state-metrics-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.200314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c6z8\" (UniqueName: \"kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8\") pod \"kube-state-metrics-0\" (UID: \"14a7aae0-6a51-49ed-b4dd-9b274885d1da\") " pod="openstack/kube-state-metrics-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.245935 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c6z8\" (UniqueName: \"kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8\") pod \"kube-state-metrics-0\" (UID: \"14a7aae0-6a51-49ed-b4dd-9b274885d1da\") " pod="openstack/kube-state-metrics-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.364064 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.640445 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.641918 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.643637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.643813 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-k49sx" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.643960 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.643979 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.644004 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.655490 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.808942 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.809088 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.809124 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.809237 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxcnq\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-kube-api-access-mxcnq\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.809345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.809422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 
crc kubenswrapper[4722]: I0219 19:36:29.809451 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.910898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxcnq\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-kube-api-access-mxcnq\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.911016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.911072 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.911100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc 
kubenswrapper[4722]: I0219 19:36:29.911136 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.911224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.911259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.912025 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.918350 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc 
kubenswrapper[4722]: I0219 19:36:29.923717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.924569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.924577 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/78e7f1b1-be76-4f05-bd63-ff87b440e173-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.928617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/78e7f1b1-be76-4f05-bd63-ff87b440e173-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.938748 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxcnq\" (UniqueName: \"kubernetes.io/projected/78e7f1b1-be76-4f05-bd63-ff87b440e173-kube-api-access-mxcnq\") pod \"alertmanager-metric-storage-0\" (UID: \"78e7f1b1-be76-4f05-bd63-ff87b440e173\") " pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:29 crc kubenswrapper[4722]: I0219 19:36:29.960505 4722 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.316974 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.319434 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.322523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.322809 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.323014 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kl9sq" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.325047 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.325047 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.325419 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.325592 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.325832 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.334924 4722 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.519724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrjf5\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.519849 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.519939 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.519974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.520033 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" 
(UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.520143 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.520218 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.520266 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.520299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 
19:36:30.520351 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621495 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621561 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: 
I0219 19:36:30.621656 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621722 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrjf5\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621753 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.621802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc 
kubenswrapper[4722]: I0219 19:36:30.621836 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.622486 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.622569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.622689 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.625396 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.625664 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.630686 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.630728 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/991d7114ade43d3df67520db88811056b16c48c5086e58d4724863cd9821be9f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.630752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.630766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.640501 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrjf5\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.647791 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.665176 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:30 crc kubenswrapper[4722]: I0219 19:36:30.942619 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.123001 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6tmmr"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.132423 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.135048 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-fwvrs"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.136848 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.138638 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-mm6tk" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.138903 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.139118 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.145584 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6tmmr"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.182353 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fwvrs"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.213653 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-lib\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.214274 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8300e35-4c72-4398-9058-0aa76005d576-scripts\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " 
pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.214351 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-log-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.214396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-log\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.214772 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.214843 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-ovn-controller-tls-certs\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.215064 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55j87\" (UniqueName: \"kubernetes.io/projected/c8300e35-4c72-4398-9058-0aa76005d576-kube-api-access-55j87\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " 
pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.215174 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-combined-ca-bundle\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.215834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hrdq\" (UniqueName: \"kubernetes.io/projected/293cde43-7bcf-4638-a080-badb26c81138-kube-api-access-8hrdq\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.215917 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-etc-ovs\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.216125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-run\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.216193 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/293cde43-7bcf-4638-a080-badb26c81138-scripts\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 
19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.216916 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.268832 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.270292 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.274577 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.274809 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.274946 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.275117 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.276103 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.281896 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kqwrn" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318751 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55j87\" (UniqueName: \"kubernetes.io/projected/c8300e35-4c72-4398-9058-0aa76005d576-kube-api-access-55j87\") pod 
\"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-combined-ca-bundle\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318860 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hrdq\" (UniqueName: \"kubernetes.io/projected/293cde43-7bcf-4638-a080-badb26c81138-kube-api-access-8hrdq\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-etc-ovs\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-run\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.318952 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/293cde43-7bcf-4638-a080-badb26c81138-scripts\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc 
kubenswrapper[4722]: I0219 19:36:33.318980 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-lib\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8300e35-4c72-4398-9058-0aa76005d576-scripts\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-log-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319141 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-log\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.319223 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-ovn-controller-tls-certs\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.321328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/293cde43-7bcf-4638-a080-badb26c81138-scripts\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.321950 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.321993 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c8300e35-4c72-4398-9058-0aa76005d576-scripts\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322035 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-log\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " 
pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-log-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322112 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-run\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/293cde43-7bcf-4638-a080-badb26c81138-var-run-ovn\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322192 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-var-lib\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.322255 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/c8300e35-4c72-4398-9058-0aa76005d576-etc-ovs\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.328085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-ovn-controller-tls-certs\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.328095 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/293cde43-7bcf-4638-a080-badb26c81138-combined-ca-bundle\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.342885 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55j87\" (UniqueName: \"kubernetes.io/projected/c8300e35-4c72-4398-9058-0aa76005d576-kube-api-access-55j87\") pod \"ovn-controller-ovs-fwvrs\" (UID: \"c8300e35-4c72-4398-9058-0aa76005d576\") " pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.346892 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hrdq\" (UniqueName: \"kubernetes.io/projected/293cde43-7bcf-4638-a080-badb26c81138-kube-api-access-8hrdq\") pod \"ovn-controller-6tmmr\" (UID: \"293cde43-7bcf-4638-a080-badb26c81138\") " pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420248 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420279 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420312 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420356 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-config\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr9hv\" (UniqueName: \"kubernetes.io/projected/13228713-9349-4241-b1f7-67f9a2c705fa-kube-api-access-nr9hv\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.420641 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-56281ae9-de37-4fa0-a83d-9703e956a545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56281ae9-de37-4fa0-a83d-9703e956a545\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.459964 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6tmmr" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.481727 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522162 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522220 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522246 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr9hv\" (UniqueName: \"kubernetes.io/projected/13228713-9349-4241-b1f7-67f9a2c705fa-kube-api-access-nr9hv\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522360 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-56281ae9-de37-4fa0-a83d-9703e956a545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56281ae9-de37-4fa0-a83d-9703e956a545\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522380 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.522413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc 
kubenswrapper[4722]: I0219 19:36:33.522993 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.523860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-config\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.524007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13228713-9349-4241-b1f7-67f9a2c705fa-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.527110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.527130 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.527776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/13228713-9349-4241-b1f7-67f9a2c705fa-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.532827 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.532876 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-56281ae9-de37-4fa0-a83d-9703e956a545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56281ae9-de37-4fa0-a83d-9703e956a545\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/98f583678ca9c1c12773c53e7f05cca773d292af92e8eebf2b9af8f5c6e51d46/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.546182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr9hv\" (UniqueName: \"kubernetes.io/projected/13228713-9349-4241-b1f7-67f9a2c705fa-kube-api-access-nr9hv\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.569371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-56281ae9-de37-4fa0-a83d-9703e956a545\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-56281ae9-de37-4fa0-a83d-9703e956a545\") pod \"ovsdbserver-nb-0\" (UID: \"13228713-9349-4241-b1f7-67f9a2c705fa\") " pod="openstack/ovsdbserver-nb-0" Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.598607 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 19 19:36:33 crc kubenswrapper[4722]: I0219 19:36:33.737992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerStarted","Data":"3a2845abf856d9cafaeec46534beacb5f3f1990d5bed57b69cf295f8fe01e4f1"}
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.535773 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.537589 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.539909 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-m656n"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.541523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.542257 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.547993 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.559384 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.714381 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.714772 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-config\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.714929 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.715104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.715241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.715348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdvv9\" (UniqueName: \"kubernetes.io/projected/05a27e5a-189e-4d17-9823-d95ef7906a7b-kube-api-access-qdvv9\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.715454 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.715573 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817836 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817874 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdvv9\" (UniqueName: \"kubernetes.io/projected/05a27e5a-189e-4d17-9823-d95ef7906a7b-kube-api-access-qdvv9\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817919 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817951 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.817973 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.818021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-config\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.818039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.818939 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.819646 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-config\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.820056 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05a27e5a-189e-4d17-9823-d95ef7906a7b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.823986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.824028 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.824253 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.824284 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ac2df717fdc68bee41330e6395ff3fb3a7398b56e12dd8c849b5929a74aef50f/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.825084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/05a27e5a-189e-4d17-9823-d95ef7906a7b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.842103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdvv9\" (UniqueName: \"kubernetes.io/projected/05a27e5a-189e-4d17-9823-d95ef7906a7b-kube-api-access-qdvv9\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:37 crc kubenswrapper[4722]: I0219 19:36:37.863098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b900ef33-d86b-4f9b-962f-c5549ee6a730\") pod \"ovsdbserver-sb-0\" (UID: \"05a27e5a-189e-4d17-9823-d95ef7906a7b\") " pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:38 crc kubenswrapper[4722]: I0219 19:36:38.157440 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.153414 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"]
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.154423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.158551 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.158867 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.159183 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-h7lsd"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.159437 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.159878 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.173010 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"]
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.256238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.256293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pbhw\" (UniqueName: \"kubernetes.io/projected/aba36975-65f4-4f71-a709-261d2b9255ea-kube-api-access-8pbhw\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.256407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.256542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.256706 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.357432 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.357501 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.357560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.357590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.357617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pbhw\" (UniqueName: \"kubernetes.io/projected/aba36975-65f4-4f71-a709-261d2b9255ea-kube-api-access-8pbhw\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.359230 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.360499 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.364751 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.365698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/aba36975-65f4-4f71-a709-261d2b9255ea-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.375576 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pbhw\" (UniqueName: \"kubernetes.io/projected/aba36975-65f4-4f71-a709-261d2b9255ea-kube-api-access-8pbhw\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-llw6c\" (UID: \"aba36975-65f4-4f71-a709-261d2b9255ea\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.378341 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"]
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.395030 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"]
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.395178 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.397482 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.399098 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.401532 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462650 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462737 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462762 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462793 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8q97\" (UniqueName: \"kubernetes.io/projected/cad6276e-0607-49e0-8a90-a11e9b916991-kube-api-access-n8q97\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.462834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.473521 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"]
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.474061 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.474517 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.485100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.485103 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.493444 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"]
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.564859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npn74\" (UniqueName: \"kubernetes.io/projected/9babbc99-4133-47c1-85e5-95039351727b-kube-api-access-npn74\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.564931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.564976 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565059 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565105 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8q97\" (UniqueName: \"kubernetes.io/projected/cad6276e-0607-49e0-8a90-a11e9b916991-kube-api-access-n8q97\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565198 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565235 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565340 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565371 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.565409 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.566436 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.567008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.574344 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.574892 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.575863 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/cad6276e-0607-49e0-8a90-a11e9b916991-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.580955 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"]
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.582111 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.588548 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.588751 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.588943 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.591481 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.592059 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.595634 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8q97\" (UniqueName: \"kubernetes.io/projected/cad6276e-0607-49e0-8a90-a11e9b916991-kube-api-access-n8q97\") pod \"cloudkitty-lokistack-querier-58c84b5844-k6gcm\" (UID: \"cad6276e-0607-49e0-8a90-a11e9b916991\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.595963 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.598730 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"]
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.626119 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"]
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.627321 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.633053 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-b5w4v"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.651488 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"]
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.667729 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg6mx\" (UniqueName: \"kubernetes.io/projected/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-kube-api-access-vg6mx\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.667780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.667822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.667847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.667875 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668312 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668336 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668363 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668447 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668600 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668633 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668663 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npn74\" (UniqueName: \"kubernetes.io/projected/9babbc99-4133-47c1-85e5-95039351727b-kube-api-access-npn74\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsgtc\" (UniqueName: 
\"kubernetes.io/projected/47cbe0b4-7d45-486b-9e9b-964db524e7ab-kube-api-access-wsgtc\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.668866 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.670027 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.672592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9babbc99-4133-47c1-85e5-95039351727b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" 
Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.675182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.691015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npn74\" (UniqueName: \"kubernetes.io/projected/9babbc99-4133-47c1-85e5-95039351727b-kube-api-access-npn74\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.697237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/9babbc99-4133-47c1-85e5-95039351727b-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8\" (UID: \"9babbc99-4133-47c1-85e5-95039351727b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.753572 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769426 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsgtc\" (UniqueName: \"kubernetes.io/projected/47cbe0b4-7d45-486b-9e9b-964db524e7ab-kube-api-access-wsgtc\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: 
\"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769563 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg6mx\" (UniqueName: \"kubernetes.io/projected/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-kube-api-access-vg6mx\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769600 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769660 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769708 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769766 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.769831 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: E0219 19:36:40.769947 4722 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Feb 19 19:36:40 crc kubenswrapper[4722]: E0219 19:36:40.769990 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret podName:fc37f35d-ac2f-40a0-90e1-40c3b80b1782 nodeName:}" failed. No retries permitted until 2026-02-19 19:36:41.269972781 +0000 UTC m=+1100.882323105 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret") pod "cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" (UID: "fc37f35d-ac2f-40a0-90e1-40c3b80b1782") : secret "cloudkitty-lokistack-gateway-http" not found Feb 19 19:36:40 crc kubenswrapper[4722]: E0219 19:36:40.770232 4722 secret.go:188] Couldn't get secret openstack/cloudkitty-lokistack-gateway-http: secret "cloudkitty-lokistack-gateway-http" not found Feb 19 19:36:40 crc kubenswrapper[4722]: E0219 19:36:40.770264 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret podName:47cbe0b4-7d45-486b-9e9b-964db524e7ab nodeName:}" failed. No retries permitted until 2026-02-19 19:36:41.27025418 +0000 UTC m=+1100.882604504 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret") pod "cloudkitty-lokistack-gateway-7f8685b49f-2j29g" (UID: "47cbe0b4-7d45-486b-9e9b-964db524e7ab") : secret "cloudkitty-lokistack-gateway-http" not found Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.771284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.772494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.773088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.773770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-ca-bundle\") pod 
\"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.774094 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.774382 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.775004 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.775945 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.776360 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.777515 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/47cbe0b4-7d45-486b-9e9b-964db524e7ab-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.777545 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.778280 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.778640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc 
kubenswrapper[4722]: I0219 19:36:40.782364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.798388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg6mx\" (UniqueName: \"kubernetes.io/projected/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-kube-api-access-vg6mx\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.798854 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:36:40 crc kubenswrapper[4722]: I0219 19:36:40.804935 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsgtc\" (UniqueName: \"kubernetes.io/projected/47cbe0b4-7d45-486b-9e9b-964db524e7ab-kube-api-access-wsgtc\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.276962 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.277287 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.287108 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/47cbe0b4-7d45-486b-9e9b-964db524e7ab-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-2j29g\" (UID: \"47cbe0b4-7d45-486b-9e9b-964db524e7ab\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.291164 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/fc37f35d-ac2f-40a0-90e1-40c3b80b1782-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-qxjk2\" (UID: \"fc37f35d-ac2f-40a0-90e1-40c3b80b1782\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.330491 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.331474 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.337767 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.338069 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.350027 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.425687 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.428945 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.431472 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.432349 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.442659 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482074 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482175 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482206 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482308 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc 
kubenswrapper[4722]: I0219 19:36:41.482410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9znm\" (UniqueName: \"kubernetes.io/projected/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-kube-api-access-p9znm\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.482449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.528116 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.529466 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.533647 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.534215 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.563232 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.571166 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.578908 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.583850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.583904 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhls5\" (UniqueName: \"kubernetes.io/projected/53bc8f19-43b1-4297-a3db-986381793b6e-kube-api-access-hhls5\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.583939 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.583964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" 
Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.583998 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584020 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584090 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: 
\"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584175 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584199 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584221 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584263 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9znm\" (UniqueName: \"kubernetes.io/projected/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-kube-api-access-p9znm\") pod \"cloudkitty-lokistack-ingester-0\" (UID: 
\"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584704 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.585402 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.585459 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.584717 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: 
\"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.587886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.588108 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.614005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9znm\" (UniqueName: \"kubernetes.io/projected/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-kube-api-access-p9znm\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.617844 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/a3fc19f1-6f9f-4f35-a391-1f6743480bd3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.633541 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") 
pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.646940 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"a3fc19f1-6f9f-4f35-a391-1f6743480bd3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.686856 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.686908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.686937 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhls5\" (UniqueName: \"kubernetes.io/projected/53bc8f19-43b1-4297-a3db-986381793b6e-kube-api-access-hhls5\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.686954 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.686979 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: 
I0219 19:36:41.687072 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkrxm\" (UniqueName: \"kubernetes.io/projected/15869f30-52a4-4db0-aca8-53c5b319f7a1-kube-api-access-rkrxm\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687132 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687169 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: 
\"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.687196 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.689189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.690704 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53bc8f19-43b1-4297-a3db-986381793b6e-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.691824 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.692059 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: 
\"53bc8f19-43b1-4297-a3db-986381793b6e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.695842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.704082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/53bc8f19-43b1-4297-a3db-986381793b6e-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.704641 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhls5\" (UniqueName: \"kubernetes.io/projected/53bc8f19-43b1-4297-a3db-986381793b6e-kube-api-access-hhls5\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.710798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"53bc8f19-43b1-4297-a3db-986381793b6e\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.746222 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.789933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkrxm\" (UniqueName: \"kubernetes.io/projected/15869f30-52a4-4db0-aca8-53c5b319f7a1-kube-api-access-rkrxm\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.790041 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.790087 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.790185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.790999 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: 
\"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.791013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.792431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.792438 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.792650 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.792754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" 
(UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.795528 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.795525 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.797897 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/15869f30-52a4-4db0-aca8-53c5b319f7a1-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.812005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkrxm\" (UniqueName: \"kubernetes.io/projected/15869f30-52a4-4db0-aca8-53c5b319f7a1-kube-api-access-rkrxm\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc 
kubenswrapper[4722]: I0219 19:36:41.821205 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"15869f30-52a4-4db0-aca8-53c5b319f7a1\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.891689 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:36:41 crc kubenswrapper[4722]: I0219 19:36:41.945818 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.065962 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.066664 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8pmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-nngml_openstack(af3a7297-2590-4a47-baa0-cd5b6029b6a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.067861 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" podUID="af3a7297-2590-4a47-baa0-cd5b6029b6a4" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.145599 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.145773 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hcgth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-9t97w_openstack(7421fc0e-3cfa-49af-a4c0-90807314bb61): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.147345 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" podUID="7421fc0e-3cfa-49af-a4c0-90807314bb61" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.217097 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.217285 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wpz6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-jmsp2_openstack(17b6c8b5-9711-4601-a0fd-a1f528e97287): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.219551 4722 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.241657 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.242094 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kf2km,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-dt86l_openstack(4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.248249 4722 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.907867 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" Feb 19 19:36:48 crc kubenswrapper[4722]: E0219 19:36:48.915461 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" Feb 19 19:36:48 crc kubenswrapper[4722]: I0219 19:36:48.964099 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.065807 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.190707 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.226143 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.784755 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.796566 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.909389 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.909580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerStarted","Data":"c749648f12e8840f28b25f37f34a53275ed4fc33d82900da005066210acf9af2"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.917496 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"53444e7f-4c1d-401b-9896-5ff9c4aab65a","Type":"ContainerStarted","Data":"f1e06939ba16d5e69507fa3a579142662a387babc305acdf6b56da52073daf71"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.919418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" event={"ID":"af3a7297-2590-4a47-baa0-cd5b6029b6a4","Type":"ContainerDied","Data":"bfeedc6c93dfc0ff6029b23856b61146b86096c4ced635aad0f319bef89af5a1"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.919472 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-nngml" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.920556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"059950bd-4e60-42e6-a9c6-4e4ab0b039aa","Type":"ContainerStarted","Data":"29233f754c6e0594fc2dce8d38065ec8f549ab8811984c4eb517f6e5dc70ca5f"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.922388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" event={"ID":"7421fc0e-3cfa-49af-a4c0-90807314bb61","Type":"ContainerDied","Data":"9c8b478ee26b023117fcef2ff288db2274a47d09a770d88e80127097d4da0163"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.922766 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-9t97w" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.931695 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.945469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a07f9633-74f5-48e5-8467-d649fc49a2ff","Type":"ContainerStarted","Data":"d4965ef535ae6d75f99b0767c9cb18768aaccd6cfe6fd189dc3e54bc294d5b21"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.952970 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"78e7f1b1-be76-4f05-bd63-ff87b440e173","Type":"ContainerStarted","Data":"9eab85e47b8d4d46d4d09632c13f9eaf4c33976add1d0726acbd83490a88c6c1"} Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.957217 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8pmh\" (UniqueName: \"kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh\") pod \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\" (UID: 
\"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.957297 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config\") pod \"7421fc0e-3cfa-49af-a4c0-90807314bb61\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.957513 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc\") pod \"7421fc0e-3cfa-49af-a4c0-90807314bb61\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.957569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config\") pod \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\" (UID: \"af3a7297-2590-4a47-baa0-cd5b6029b6a4\") " Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.957643 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcgth\" (UniqueName: \"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth\") pod \"7421fc0e-3cfa-49af-a4c0-90807314bb61\" (UID: \"7421fc0e-3cfa-49af-a4c0-90807314bb61\") " Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.958287 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7421fc0e-3cfa-49af-a4c0-90807314bb61" (UID: "7421fc0e-3cfa-49af-a4c0-90807314bb61"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.958326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config" (OuterVolumeSpecName: "config") pod "7421fc0e-3cfa-49af-a4c0-90807314bb61" (UID: "7421fc0e-3cfa-49af-a4c0-90807314bb61"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.958326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config" (OuterVolumeSpecName: "config") pod "af3a7297-2590-4a47-baa0-cd5b6029b6a4" (UID: "af3a7297-2590-4a47-baa0-cd5b6029b6a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.959335 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.959370 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7421fc0e-3cfa-49af-a4c0-90807314bb61-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.959378 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af3a7297-2590-4a47-baa0-cd5b6029b6a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.962678 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"] Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.963795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth" (OuterVolumeSpecName: "kube-api-access-hcgth") pod "7421fc0e-3cfa-49af-a4c0-90807314bb61" (UID: "7421fc0e-3cfa-49af-a4c0-90807314bb61"). InnerVolumeSpecName "kube-api-access-hcgth". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.965795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh" (OuterVolumeSpecName: "kube-api-access-r8pmh") pod "af3a7297-2590-4a47-baa0-cd5b6029b6a4" (UID: "af3a7297-2590-4a47-baa0-cd5b6029b6a4"). InnerVolumeSpecName "kube-api-access-r8pmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:36:49 crc kubenswrapper[4722]: W0219 19:36:49.977457 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08df2e8_3f03_4e9c_91cf_2890026b9d76.slice/crio-3162a6ec047952b568aaa8c73c253863e5256bf0afc61efa90f1b0efd37039e5 WatchSource:0}: Error finding container 3162a6ec047952b568aaa8c73c253863e5256bf0afc61efa90f1b0efd37039e5: Status 404 returned error can't find the container with id 3162a6ec047952b568aaa8c73c253863e5256bf0afc61efa90f1b0efd37039e5 Feb 19 19:36:49 crc kubenswrapper[4722]: I0219 19:36:49.981306 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:36:49 crc kubenswrapper[4722]: W0219 19:36:49.983840 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9babbc99_4133_47c1_85e5_95039351727b.slice/crio-8c54a7393544cb2ec46a727e560fc4f5a2ca6dfb7c19b743bd4f35274cdc6aed WatchSource:0}: Error finding container 8c54a7393544cb2ec46a727e560fc4f5a2ca6dfb7c19b743bd4f35274cdc6aed: Status 404 returned error can't find the container with id 
8c54a7393544cb2ec46a727e560fc4f5a2ca6dfb7c19b743bd4f35274cdc6aed Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.061398 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcgth\" (UniqueName: \"kubernetes.io/projected/7421fc0e-3cfa-49af-a4c0-90807314bb61-kube-api-access-hcgth\") on node \"crc\" DevicePath \"\"" Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.061438 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8pmh\" (UniqueName: \"kubernetes.io/projected/af3a7297-2590-4a47-baa0-cd5b6029b6a4-kube-api-access-r8pmh\") on node \"crc\" DevicePath \"\"" Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.273777 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.297944 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6tmmr"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.312011 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.328709 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.364755 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"] Feb 19 19:36:50 crc kubenswrapper[4722]: W0219 19:36:50.371629 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3fc19f1_6f9f_4f35_a391_1f6743480bd3.slice/crio-49b4c1dd3ea0999cdd83dcd2e854e188c74d02443bd7c56d2f0d5c378a3783ff WatchSource:0}: Error finding container 49b4c1dd3ea0999cdd83dcd2e854e188c74d02443bd7c56d2f0d5c378a3783ff: Status 404 returned error can't find the container with id 49b4c1dd3ea0999cdd83dcd2e854e188c74d02443bd7c56d2f0d5c378a3783ff Feb 19 
19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.374204 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-nngml"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.396076 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.418128 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-9t97w"] Feb 19 19:36:50 crc kubenswrapper[4722]: W0219 19:36:50.427281 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47cbe0b4_7d45_486b_9e9b_964db524e7ab.slice/crio-887bf678f363ef94f9ad923d8230423441c6e8e8bc4cd18d84da825ca392b3ce WatchSource:0}: Error finding container 887bf678f363ef94f9ad923d8230423441c6e8e8bc4cd18d84da825ca392b3ce: Status 404 returned error can't find the container with id 887bf678f363ef94f9ad923d8230423441c6e8e8bc4cd18d84da825ca392b3ce Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.432138 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.460088 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.460198 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.470165 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c"] Feb 19 19:36:50 crc kubenswrapper[4722]: W0219 19:36:50.539668 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53bc8f19_43b1_4297_a3db_986381793b6e.slice/crio-9bcbba825dfff1439d6c300bd2d4c898b7f72ff45b236eff201c32ed80ceedc3 WatchSource:0}: Error finding container 9bcbba825dfff1439d6c300bd2d4c898b7f72ff45b236eff201c32ed80ceedc3: Status 404 returned error can't find the container with id 9bcbba825dfff1439d6c300bd2d4c898b7f72ff45b236eff201c32ed80ceedc3 Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.541518 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-fwvrs"] Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.964829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"53bc8f19-43b1-4297-a3db-986381793b6e","Type":"ContainerStarted","Data":"9bcbba825dfff1439d6c300bd2d4c898b7f72ff45b236eff201c32ed80ceedc3"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.967657 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"15869f30-52a4-4db0-aca8-53c5b319f7a1","Type":"ContainerStarted","Data":"aca690584825492c1c0bff04b230724017b6329ddea13ba665c0bc38724b6aac"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.968691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fwvrs" event={"ID":"c8300e35-4c72-4398-9058-0aa76005d576","Type":"ContainerStarted","Data":"e660effba251555c0c949b7e36ff3e2dc66e7b37a63e0229bb54eb33f555314b"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.970072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" event={"ID":"9babbc99-4133-47c1-85e5-95039351727b","Type":"ContainerStarted","Data":"8c54a7393544cb2ec46a727e560fc4f5a2ca6dfb7c19b743bd4f35274cdc6aed"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.974082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-6tmmr" event={"ID":"293cde43-7bcf-4638-a080-badb26c81138","Type":"ContainerStarted","Data":"c1ba056f6d26a097d50633f3948e29ae5a7be3e9d18eae86e25eecf62e11ef0a"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.976298 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" event={"ID":"47cbe0b4-7d45-486b-9e9b-964db524e7ab","Type":"ContainerStarted","Data":"887bf678f363ef94f9ad923d8230423441c6e8e8bc4cd18d84da825ca392b3ce"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.979289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerStarted","Data":"e127436a9b7fd84ddf258ebc3a3c64c5ddb9a7269490c5535eccdc44ec44422d"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.982826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14a7aae0-6a51-49ed-b4dd-9b274885d1da","Type":"ContainerStarted","Data":"5545dc8f3e2de249c7840626da07d4ee4ba5dd553856353617c9f89c2873d54d"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.984486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13228713-9349-4241-b1f7-67f9a2c705fa","Type":"ContainerStarted","Data":"291a68b55c5e55f85f7ff850e7faae5b89ae9c96777eeecedb246b9d8bc560b6"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.985738 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" event={"ID":"aba36975-65f4-4f71-a709-261d2b9255ea","Type":"ContainerStarted","Data":"5cc4476d61dd69eeb9b1a29772eb754a6f66b053a97c83b9236a853d67923df4"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.986833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" 
event={"ID":"fc37f35d-ac2f-40a0-90e1-40c3b80b1782","Type":"ContainerStarted","Data":"11f7874fd713795c3df3569dfb6f3b2b4fc68b64a645b032ff0ea869617a710d"} Feb 19 19:36:50 crc kubenswrapper[4722]: I0219 19:36:50.994272 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerStarted","Data":"3162a6ec047952b568aaa8c73c253863e5256bf0afc61efa90f1b0efd37039e5"} Feb 19 19:36:51 crc kubenswrapper[4722]: I0219 19:36:51.026300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"a3fc19f1-6f9f-4f35-a391-1f6743480bd3","Type":"ContainerStarted","Data":"49b4c1dd3ea0999cdd83dcd2e854e188c74d02443bd7c56d2f0d5c378a3783ff"} Feb 19 19:36:51 crc kubenswrapper[4722]: I0219 19:36:51.031996 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" event={"ID":"cad6276e-0607-49e0-8a90-a11e9b916991","Type":"ContainerStarted","Data":"23cfa51f66b510a9f59c6a0ad5b9d6cad884a31e01018fe52264bc7647f4333e"} Feb 19 19:36:51 crc kubenswrapper[4722]: I0219 19:36:51.086007 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7421fc0e-3cfa-49af-a4c0-90807314bb61" path="/var/lib/kubelet/pods/7421fc0e-3cfa-49af-a4c0-90807314bb61/volumes" Feb 19 19:36:51 crc kubenswrapper[4722]: I0219 19:36:51.086410 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3a7297-2590-4a47-baa0-cd5b6029b6a4" path="/var/lib/kubelet/pods/af3a7297-2590-4a47-baa0-cd5b6029b6a4/volumes" Feb 19 19:36:51 crc kubenswrapper[4722]: I0219 19:36:51.320083 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 19:36:53 crc kubenswrapper[4722]: I0219 19:36:53.068211 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"05a27e5a-189e-4d17-9823-d95ef7906a7b","Type":"ContainerStarted","Data":"d68eda65528da0413baa83abe4fa0066a48a389941ffe61388b361110f4fd855"}
Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.417823 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified"
Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.418784 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc4h96h589h559h54ch5ddh89h5fdh5c4h575h578h696h589h5dbh594h9bhb5hf5h655h57dh54dh558h55ch54h67fh5bch66bhc5h577h687h598h76q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-55j87,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-fwvrs_openstack(c8300e35-4c72-4398-9058-0aa76005d576): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.420860 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-fwvrs" podUID="c8300e35-4c72-4398-9058-0aa76005d576"
Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.490908 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified"
Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.491100 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twqwb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(a07f9633-74f5-48e5-8467-d649fc49a2ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 19 19:37:04 crc kubenswrapper[4722]: E0219 19:37:04.492260 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="a07f9633-74f5-48e5-8467-d649fc49a2ff"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.104791 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.105016 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key --tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vg6mx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-7f8685b49f-qxjk2_openstack(fc37f35d-ac2f-40a0-90e1-40c3b80b1782): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.106440 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" podUID="fc37f35d-ac2f-40a0-90e1-40c3b80b1782"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.127292 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.127479 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:gateway,Image:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934,Command:[],Args:[--debug.name=lokistack-gateway --web.listen=0.0.0.0:8080 --web.internal.listen=0.0.0.0:8081 --web.healthchecks.url=https://localhost:8080 --log.level=warn --logs.read.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.tail.endpoint=https://cloudkitty-lokistack-query-frontend-http.openstack.svc.cluster.local:3100 --logs.write.endpoint=https://cloudkitty-lokistack-distributor-http.openstack.svc.cluster.local:3100 --logs.write-timeout=4m0s --rbac.config=/etc/lokistack-gateway/rbac.yaml --tenants.config=/etc/lokistack-gateway/tenants.yaml --server.read-timeout=48s --server.write-timeout=6m0s --tls.min-version=VersionTLS12 --tls.server.cert-file=/var/run/tls/http/server/tls.crt --tls.server.key-file=/var/run/tls/http/server/tls.key --tls.healthchecks.server-ca-file=/var/run/ca/server/service-ca.crt --tls.healthchecks.server-name=cloudkitty-lokistack-gateway-http.openstack.svc.cluster.local --tls.internal.server.cert-file=/var/run/tls/http/server/tls.crt --tls.internal.server.key-file=/var/run/tls/http/server/tls.key --tls.min-version=VersionTLS12 --tls.cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --logs.tls.ca-file=/var/run/ca/upstream/service-ca.crt --logs.tls.cert-file=/var/run/tls/http/upstream/tls.crt --logs.tls.key-file=/var/run/tls/http/upstream/tls.key --tls.client-auth-type=RequestClientCert],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},ContainerPort{Name:public,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rbac,ReadOnly:true,MountPath:/etc/lokistack-gateway/rbac.yaml,SubPath:rbac.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tenants,ReadOnly:true,MountPath:/etc/lokistack-gateway/tenants.yaml,SubPath:tenants.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lokistack-gateway,ReadOnly:true,MountPath:/etc/lokistack-gateway/lokistack-gateway.rego,SubPath:lokistack-gateway.rego,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-secret,ReadOnly:true,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-client-http,ReadOnly:true,MountPath:/var/run/tls/http/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/upstream,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-gateway-ca-bundle,ReadOnly:true,MountPath:/var/run/ca/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-ca-bundle,ReadOnly:false,MountPath:/var/run/tenants-ca/cloudkitty,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wsgtc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/live,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 8081 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:12,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-gateway-7f8685b49f-2j29g_openstack(47cbe0b4-7d45-486b-9e9b-964db524e7ab): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.128713 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" podUID="47cbe0b4-7d45-486b-9e9b-964db524e7ab"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.196838 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-fwvrs" podUID="c8300e35-4c72-4398-9058-0aa76005d576"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.201321 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.201524 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-ingester,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=ingester -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:wal,ReadOnly:false,MountPath:/tmp/wal,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ingester-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ingester-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9znm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-ingester-0_openstack(a3fc19f1-6f9f-4f35-a391-1f6743480bd3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.202735 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-ingester\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a3fc19f1-6f9f-4f35-a391-1f6743480bd3"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.202756 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" podUID="fc37f35d-ac2f-40a0-90e1-40c3b80b1782"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.202729 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"gateway\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:41eda20b890c200ee7fce0b56b5d168445cd9a6486d560f39ce73d0704e03934\\\"\"" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" podUID="47cbe0b4-7d45-486b-9e9b-964db524e7ab"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.203645 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.203794 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-querier,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=querier -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-querier-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n8q97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-querier-58c84b5844-k6gcm_openstack(cad6276e-0607-49e0-8a90-a11e9b916991): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.204976 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-querier\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" podUID="cad6276e-0607-49e0-8a90-a11e9b916991"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.208860 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.209015 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init-config-reloader,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:9a2097bc5b2e02bc1703f64c452ce8fe4bc6775b732db930ff4770b76ae4653a,Command:[/bin/prometheus-config-reloader],Args:[--watch-interval=0 --listen-address=:8081 --config-file=/etc/prometheus/config/prometheus.yaml.gz --config-envsubst-file=/etc/prometheus/config_out/prometheus.env.yaml --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0 --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1 --watched-dir=/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:reloader-init,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:SHARD,Value:0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/prometheus/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-out,ReadOnly:false,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-1,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-1,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-2,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-2,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nrjf5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(a08df2e8-3f03-4e9c-91cf-2890026b9d76): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:37:05 crc kubenswrapper[4722]: E0219 19:37:05.210119 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-config-reloader\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76"
Feb 19 19:37:06 crc kubenswrapper[4722]: E0219 19:37:06.465197 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Feb 19 19:37:06 crc kubenswrapper[4722]: E0219 19:37:06.465757 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Feb 19 19:37:06 crc kubenswrapper[4722]: E0219 19:37:06.465882 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8c6z8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(14a7aae0-6a51-49ed-b4dd-9b274885d1da): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 19 19:37:06 crc kubenswrapper[4722]: E0219 19:37:06.467088 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da"
Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.206918 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a07f9633-74f5-48e5-8467-d649fc49a2ff","Type":"ContainerStarted","Data":"c4e1aaf5987ae7613ffce2d46a9e81da3905d6934748687233a4bb3c5c36ee1f"}
Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.208859 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"15869f30-52a4-4db0-aca8-53c5b319f7a1","Type":"ContainerStarted","Data":"7e4f75abf37ffede84097a88c6af1bf378a8112527438e23b556b46ba20eb725"}
Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.209193 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.210449 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" event={"ID":"9babbc99-4133-47c1-85e5-95039351727b","Type":"ContainerStarted","Data":"03851b96b20e4de777f586e529e4d46af2053221200c0ef3829199d739e1473d"}
Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.210693 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8"
Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.213119 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"059950bd-4e60-42e6-a9c6-4e4ab0b039aa","Type":"ContainerStarted","Data":"c04723d9f530225ce94b410cbfa59b3fac45d29eee478a25a703ed87bb459cf2"}
Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.213173 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 19 19:37:07 crc kubenswrapper[4722]: E0219 19:37:07.218858 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da"
Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.256177 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=11.784457127 podStartE2EDuration="27.256138672s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.469184272 +0000 UTC m=+1110.081534596" lastFinishedPulling="2026-02-19 19:37:05.940865817 +0000 UTC m=+1125.553216141" observedRunningTime="2026-02-19 19:37:07.249133795 +0000 UTC m=+1126.861484159" watchObservedRunningTime="2026-02-19 19:37:07.256138672 +0000 UTC m=+1126.868488996"
Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.270142 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" podStartSLOduration=11.540327502 podStartE2EDuration="27.270123577s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.00341815 +0000 UTC m=+1109.615768474" lastFinishedPulling="2026-02-19 19:37:05.733214225 +0000 UTC m=+1125.345564549" observedRunningTime="2026-02-19 19:37:07.264530813 +0000 UTC m=+1126.876881137" watchObservedRunningTime="2026-02-19 19:37:07.270123577 +0000 UTC m=+1126.882473901"
Feb 19 19:37:07 crc kubenswrapper[4722]: I0219 19:37:07.309748 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=24.987636365 podStartE2EDuration="41.30972427s" podCreationTimestamp="2026-02-19 19:36:26 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.302338705 +0000 UTC m=+1108.914689029" lastFinishedPulling="2026-02-19 19:37:05.62442661 +0000 UTC m=+1125.236776934" observedRunningTime="2026-02-19 19:37:07.301130932 +0000 UTC m=+1126.913481256" watchObservedRunningTime="2026-02-19 19:37:07.30972427 +0000 UTC m=+1126.922074594"
Feb 19 19:37:08 crc kubenswrapper[4722]: I0219
19:37:08.218709 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6tmmr" event={"ID":"293cde43-7bcf-4638-a080-badb26c81138","Type":"ContainerStarted","Data":"d1883e01a92455fb7d08ab0d5b74082dd3d67da4295aa6046b5334330d7445e4"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.219036 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6tmmr" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.220972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" event={"ID":"aba36975-65f4-4f71-a709-261d2b9255ea","Type":"ContainerStarted","Data":"1db9ab7e51ecff4bb8b508a812b573f7e7fa538e8898e96a6c05d7db6dd9f9c6"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.221120 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.223014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13228713-9349-4241-b1f7-67f9a2c705fa","Type":"ContainerStarted","Data":"62db51a139bce149b57aa8c7df3c09f8b6a05984eb5694fd234c9b52b703e4bc"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.224511 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"05a27e5a-189e-4d17-9823-d95ef7906a7b","Type":"ContainerStarted","Data":"f484cd67b93ed6d2d71d614d82319268658739c5cac1f6e4267d76484d315d23"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.226710 4722 generic.go:334] "Generic (PLEG): container finished" podID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerID="99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97" exitCode=0 Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.226757 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" 
event={"ID":"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465","Type":"ContainerDied","Data":"99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.232428 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"53444e7f-4c1d-401b-9896-5ff9c4aab65a","Type":"ContainerStarted","Data":"13563db377a2d356d6c5e051100eb3fdec737b3d97a75f97173241fd5519e50d"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.235692 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"53bc8f19-43b1-4297-a3db-986381793b6e","Type":"ContainerStarted","Data":"a6e3a7de83d225eadcc665957ce6c87d1619bbdc666aa3816ec19858e9188fdd"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.236457 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.247762 4722 generic.go:334] "Generic (PLEG): container finished" podID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerID="a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1" exitCode=0 Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.247861 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" event={"ID":"17b6c8b5-9711-4601-a0fd-a1f528e97287","Type":"ContainerDied","Data":"a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1"} Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.248911 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6tmmr" podStartSLOduration=19.771968806 podStartE2EDuration="35.248881713s" podCreationTimestamp="2026-02-19 19:36:33 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.372435112 +0000 UTC m=+1109.984785436" lastFinishedPulling="2026-02-19 19:37:05.849348019 +0000 UTC m=+1125.461698343" 
observedRunningTime="2026-02-19 19:37:08.238902762 +0000 UTC m=+1127.851253086" watchObservedRunningTime="2026-02-19 19:37:08.248881713 +0000 UTC m=+1127.861232047" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.283940 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=12.89198666 podStartE2EDuration="28.283922263s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.548737028 +0000 UTC m=+1110.161087352" lastFinishedPulling="2026-02-19 19:37:05.940672641 +0000 UTC m=+1125.553022955" observedRunningTime="2026-02-19 19:37:08.268202214 +0000 UTC m=+1127.880552558" watchObservedRunningTime="2026-02-19 19:37:08.283922263 +0000 UTC m=+1127.896272587" Feb 19 19:37:08 crc kubenswrapper[4722]: I0219 19:37:08.300980 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" podStartSLOduration=12.902858558 podStartE2EDuration="28.300956773s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.542615347 +0000 UTC m=+1110.154965671" lastFinishedPulling="2026-02-19 19:37:05.940713562 +0000 UTC m=+1125.553063886" observedRunningTime="2026-02-19 19:37:08.289412103 +0000 UTC m=+1127.901762477" watchObservedRunningTime="2026-02-19 19:37:08.300956773 +0000 UTC m=+1127.913307117" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.261851 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" event={"ID":"17b6c8b5-9711-4601-a0fd-a1f528e97287","Type":"ContainerStarted","Data":"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8"} Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.263712 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.267633 
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" event={"ID":"cad6276e-0607-49e0-8a90-a11e9b916991","Type":"ContainerStarted","Data":"ba81c6123c8a4f0cc4800e1eccdf8da2890142be6cd3b5f5778d095d7e56cccf"} Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.267864 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.269721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" event={"ID":"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465","Type":"ContainerStarted","Data":"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434"} Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.270292 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.272226 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"a3fc19f1-6f9f-4f35-a391-1f6743480bd3","Type":"ContainerStarted","Data":"ab71ad53934879b34f098a9a5647e7bcb7133425b126895ab692a938daacf85d"} Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.272986 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.295011 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" podStartSLOduration=4.719654303 podStartE2EDuration="47.294989363s" podCreationTimestamp="2026-02-19 19:36:22 +0000 UTC" firstStartedPulling="2026-02-19 19:36:23.886493827 +0000 UTC m=+1083.498844151" lastFinishedPulling="2026-02-19 19:37:06.461828887 +0000 UTC m=+1126.074179211" observedRunningTime="2026-02-19 19:37:09.28237241 +0000 UTC m=+1128.894722744" 
watchObservedRunningTime="2026-02-19 19:37:09.294989363 +0000 UTC m=+1128.907339687" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.335433 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" podStartSLOduration=4.451686527 podStartE2EDuration="47.33539446s" podCreationTimestamp="2026-02-19 19:36:22 +0000 UTC" firstStartedPulling="2026-02-19 19:36:23.574118649 +0000 UTC m=+1083.186468973" lastFinishedPulling="2026-02-19 19:37:06.457826582 +0000 UTC m=+1126.070176906" observedRunningTime="2026-02-19 19:37:09.296225561 +0000 UTC m=+1128.908575895" watchObservedRunningTime="2026-02-19 19:37:09.33539446 +0000 UTC m=+1128.947744784" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.346447 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=-9223372007.508347 podStartE2EDuration="29.346429443s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.374825336 +0000 UTC m=+1109.987175660" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:09.323731638 +0000 UTC m=+1128.936082002" watchObservedRunningTime="2026-02-19 19:37:09.346429443 +0000 UTC m=+1128.958779767" Feb 19 19:37:09 crc kubenswrapper[4722]: I0219 19:37:09.358016 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" podStartSLOduration=-9223372007.496778 podStartE2EDuration="29.357997613s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.294557829 +0000 UTC m=+1109.906908153" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:09.338380103 +0000 UTC m=+1128.950730447" watchObservedRunningTime="2026-02-19 19:37:09.357997613 +0000 UTC m=+1128.970347937" Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 
19:37:10.283769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerStarted","Data":"a57f0a1057a7622bf6cd5a97f7d1c754dd0d44986fc9d7f455890c4bc7caac51"} Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 19:37:10.285759 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"78e7f1b1-be76-4f05-bd63-ff87b440e173","Type":"ContainerStarted","Data":"873d37ee99fb511c8da26dd67c2f29770d664f57f955aa7d18b5dc0f234df076"} Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 19:37:10.287429 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"13228713-9349-4241-b1f7-67f9a2c705fa","Type":"ContainerStarted","Data":"e05fde647fb1e3dd532c2cdd72bd74727b4395da4d5ee52337145b8d1d53bc36"} Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 19:37:10.288953 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"05a27e5a-189e-4d17-9823-d95ef7906a7b","Type":"ContainerStarted","Data":"c7e351be424166d5d7820a88ce1149818ad63b6b821866688b2c8efa855c70f0"} Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 19:37:10.324881 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.847774373 podStartE2EDuration="38.324864068s" podCreationTimestamp="2026-02-19 19:36:32 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.345498204 +0000 UTC m=+1109.957848518" lastFinishedPulling="2026-02-19 19:37:09.822587879 +0000 UTC m=+1129.434938213" observedRunningTime="2026-02-19 19:37:10.323904108 +0000 UTC m=+1129.936254422" watchObservedRunningTime="2026-02-19 19:37:10.324864068 +0000 UTC m=+1129.937214392" Feb 19 19:37:10 crc kubenswrapper[4722]: I0219 19:37:10.368337 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" 
podStartSLOduration=17.477743775 podStartE2EDuration="34.36831922s" podCreationTimestamp="2026-02-19 19:36:36 +0000 UTC" firstStartedPulling="2026-02-19 19:36:52.92284877 +0000 UTC m=+1112.535199094" lastFinishedPulling="2026-02-19 19:37:09.813424215 +0000 UTC m=+1129.425774539" observedRunningTime="2026-02-19 19:37:10.363445458 +0000 UTC m=+1129.975795802" watchObservedRunningTime="2026-02-19 19:37:10.36831922 +0000 UTC m=+1129.980669534" Feb 19 19:37:11 crc kubenswrapper[4722]: I0219 19:37:11.158659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 19:37:11 crc kubenswrapper[4722]: I0219 19:37:11.232831 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 19:37:11 crc kubenswrapper[4722]: I0219 19:37:11.297446 4722 generic.go:334] "Generic (PLEG): container finished" podID="a07f9633-74f5-48e5-8467-d649fc49a2ff" containerID="c4e1aaf5987ae7613ffce2d46a9e81da3905d6934748687233a4bb3c5c36ee1f" exitCode=0 Feb 19 19:37:11 crc kubenswrapper[4722]: I0219 19:37:11.297582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a07f9633-74f5-48e5-8467-d649fc49a2ff","Type":"ContainerDied","Data":"c4e1aaf5987ae7613ffce2d46a9e81da3905d6934748687233a4bb3c5c36ee1f"} Feb 19 19:37:11 crc kubenswrapper[4722]: I0219 19:37:11.298407 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.250434 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.311191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a07f9633-74f5-48e5-8467-d649fc49a2ff","Type":"ContainerStarted","Data":"8f4e164cf5a8efd63d40164ec3935ff84db296d1d35f8d95c33fe3839e08d122"} Feb 19 
19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.359100 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371989.495707 podStartE2EDuration="47.359068934s" podCreationTimestamp="2026-02-19 19:36:25 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.210066374 +0000 UTC m=+1108.822416698" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:12.345275825 +0000 UTC m=+1131.957626169" watchObservedRunningTime="2026-02-19 19:37:12.359068934 +0000 UTC m=+1131.971419288" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.379675 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.599628 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.631907 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.632686 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="dnsmasq-dns" containerID="cri-o://3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8" gracePeriod=10 Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.654297 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.656025 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.662322 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.668683 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.675748 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.758292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.758400 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzgmn\" (UniqueName: \"kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.758520 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.758562 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.822500 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-282bs"] Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.830937 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.834005 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.860401 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzgmn\" (UniqueName: \"kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.860604 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.860645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.860703 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.862085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.862328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.862972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.863086 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-282bs"] Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.918804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzgmn\" (UniqueName: \"kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn\") pod \"dnsmasq-dns-7f896c8c65-794vh\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 
19:37:12.962606 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9470e2b8-0f01-4735-8050-1bae363b3a02-config\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.962868 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-combined-ca-bundle\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.962990 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsn9\" (UniqueName: \"kubernetes.io/projected/9470e2b8-0f01-4735-8050-1bae363b3a02-kube-api-access-xbsn9\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.963110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovs-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc kubenswrapper[4722]: I0219 19:37:12.963212 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovn-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:12 crc 
kubenswrapper[4722]: I0219 19:37:12.963287 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.024704 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.064601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9470e2b8-0f01-4735-8050-1bae363b3a02-config\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065004 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-combined-ca-bundle\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsn9\" (UniqueName: \"kubernetes.io/projected/9470e2b8-0f01-4735-8050-1bae363b3a02-kube-api-access-xbsn9\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065137 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovs-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovn-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065225 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.065653 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovs-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.066200 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/9470e2b8-0f01-4735-8050-1bae363b3a02-ovn-rundir\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.066720 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9470e2b8-0f01-4735-8050-1bae363b3a02-config\") pod 
\"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.072857 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-combined-ca-bundle\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.101424 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9470e2b8-0f01-4735-8050-1bae363b3a02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.112746 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbsn9\" (UniqueName: \"kubernetes.io/projected/9470e2b8-0f01-4735-8050-1bae363b3a02-kube-api-access-xbsn9\") pod \"ovn-controller-metrics-282bs\" (UID: \"9470e2b8-0f01-4735-8050-1bae363b3a02\") " pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.147267 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-282bs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.174639 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.265004 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.265249 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="dnsmasq-dns" containerID="cri-o://a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434" gracePeriod=10 Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.270923 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc\") pod \"17b6c8b5-9711-4601-a0fd-a1f528e97287\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.271010 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpz6x\" (UniqueName: \"kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x\") pod \"17b6c8b5-9711-4601-a0fd-a1f528e97287\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.271172 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config\") pod \"17b6c8b5-9711-4601-a0fd-a1f528e97287\" (UID: \"17b6c8b5-9711-4601-a0fd-a1f528e97287\") " Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.280349 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x" (OuterVolumeSpecName: "kube-api-access-wpz6x") pod "17b6c8b5-9711-4601-a0fd-a1f528e97287" (UID: 
"17b6c8b5-9711-4601-a0fd-a1f528e97287"). InnerVolumeSpecName "kube-api-access-wpz6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.299432 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"] Feb 19 19:37:13 crc kubenswrapper[4722]: E0219 19:37:13.299843 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="dnsmasq-dns" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.299855 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="dnsmasq-dns" Feb 19 19:37:13 crc kubenswrapper[4722]: E0219 19:37:13.299880 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="init" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.299887 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="init" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.300038 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerName="dnsmasq-dns" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.301075 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.305166 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.337891 4722 generic.go:334] "Generic (PLEG): container finished" podID="17b6c8b5-9711-4601-a0fd-a1f528e97287" containerID="3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8" exitCode=0 Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.337969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" event={"ID":"17b6c8b5-9711-4601-a0fd-a1f528e97287","Type":"ContainerDied","Data":"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8"} Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.338017 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" event={"ID":"17b6c8b5-9711-4601-a0fd-a1f528e97287","Type":"ContainerDied","Data":"967b9f32e64a62fdd1e64949dcea547f81e3dffcf01eb6300e42284f5721c31d"} Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.338033 4722 scope.go:117] "RemoveContainer" containerID="3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.338187 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jmsp2" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.347496 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17b6c8b5-9711-4601-a0fd-a1f528e97287" (UID: "17b6c8b5-9711-4601-a0fd-a1f528e97287"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.348402 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.350026 4722 generic.go:334] "Generic (PLEG): container finished" podID="53444e7f-4c1d-401b-9896-5ff9c4aab65a" containerID="13563db377a2d356d6c5e051100eb3fdec737b3d97a75f97173241fd5519e50d" exitCode=0 Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.351314 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"53444e7f-4c1d-401b-9896-5ff9c4aab65a","Type":"ContainerDied","Data":"13563db377a2d356d6c5e051100eb3fdec737b3d97a75f97173241fd5519e50d"} Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.351345 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372612 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw227\" (UniqueName: \"kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372698 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372756 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372772 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372815 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372864 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.372875 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpz6x\" (UniqueName: \"kubernetes.io/projected/17b6c8b5-9711-4601-a0fd-a1f528e97287-kube-api-access-wpz6x\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.420201 4722 scope.go:117] "RemoveContainer" containerID="a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.420405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.423386 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config" (OuterVolumeSpecName: "config") pod "17b6c8b5-9711-4601-a0fd-a1f528e97287" (UID: "17b6c8b5-9711-4601-a0fd-a1f528e97287"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474330 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474504 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474620 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474661 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cw227\" (UniqueName: \"kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.474831 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17b6c8b5-9711-4601-a0fd-a1f528e97287-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.475580 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.475667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.475762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.477515 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.485753 4722 scope.go:117] "RemoveContainer" containerID="3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8" Feb 19 19:37:13 crc kubenswrapper[4722]: E0219 19:37:13.491464 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8\": container with ID starting with 3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8 not found: ID does not exist" containerID="3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.491507 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8"} err="failed to get container status \"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8\": rpc error: code = NotFound desc = could not find container \"3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8\": container with ID starting with 3cdad126448a39a8b1ddf90e43a46afa7c142c965c57ee83d2d8df359f96aef8 not found: ID does not exist" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.491537 4722 scope.go:117] "RemoveContainer" containerID="a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1" Feb 19 19:37:13 crc kubenswrapper[4722]: E0219 19:37:13.491823 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1\": container with ID starting with a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1 not found: ID does not exist" containerID="a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.491854 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1"} err="failed to get container status \"a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1\": rpc error: code = NotFound desc = could not find container \"a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1\": container with ID starting with a4567d54e40db4a121620d0651f9f61ac38d561d4296882e40fbed9e7f8bb2f1 not found: ID does not exist" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.493051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw227\" (UniqueName: \"kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227\") pod \"dnsmasq-dns-86db49b7ff-xtsln\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.625787 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:13 crc kubenswrapper[4722]: W0219 19:37:13.627633 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec94b2bc_5f45_48da_aaba_8bd3e5c8e29b.slice/crio-ac0471b72b28ebe62e8e5340675d6bb6dfa27b0ec4d4c1d82ec67fc0c6b1fab9 WatchSource:0}: Error finding container ac0471b72b28ebe62e8e5340675d6bb6dfa27b0ec4d4c1d82ec67fc0c6b1fab9: Status 404 returned error can't find the container with id ac0471b72b28ebe62e8e5340675d6bb6dfa27b0ec4d4c1d82ec67fc0c6b1fab9 Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.650075 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.761939 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.783260 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jmsp2"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.792759 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.796217 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.799465 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7rpl2" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.800048 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.801598 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.801858 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.807734 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.847980 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-282bs"] Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.886106 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.886585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-scripts\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.886631 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.886660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-config\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.886822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnddz\" (UniqueName: \"kubernetes.io/projected/6f8e6f58-f989-41f2-b8cb-c798405cfa33-kube-api-access-fnddz\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.888258 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc 
kubenswrapper[4722]: I0219 19:37:13.888286 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.988023 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.990662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.991177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.991213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.991280 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc 
kubenswrapper[4722]: I0219 19:37:13.992015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-scripts\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.992068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.992088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-config\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.992141 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnddz\" (UniqueName: \"kubernetes.io/projected/6f8e6f58-f989-41f2-b8cb-c798405cfa33-kube-api-access-fnddz\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.994571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.994665 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-config\") 
pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.994810 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:13 crc kubenswrapper[4722]: I0219 19:37:13.994990 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f8e6f58-f989-41f2-b8cb-c798405cfa33-scripts\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.010571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f8e6f58-f989-41f2-b8cb-c798405cfa33-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.021582 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnddz\" (UniqueName: \"kubernetes.io/projected/6f8e6f58-f989-41f2-b8cb-c798405cfa33-kube-api-access-fnddz\") pod \"ovn-northd-0\" (UID: \"6f8e6f58-f989-41f2-b8cb-c798405cfa33\") " pod="openstack/ovn-northd-0" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.093058 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config\") pod \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.096353 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-kf2km\" (UniqueName: \"kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km\") pod \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.096436 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc\") pod \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\" (UID: \"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465\") " Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.102189 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km" (OuterVolumeSpecName: "kube-api-access-kf2km") pod "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" (UID: "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465"). InnerVolumeSpecName "kube-api-access-kf2km". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.137208 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config" (OuterVolumeSpecName: "config") pod "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" (UID: "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.158823 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" (UID: "4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.198480 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.198514 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf2km\" (UniqueName: \"kubernetes.io/projected/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-kube-api-access-kf2km\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.198527 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.207931 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"] Feb 19 19:37:14 crc kubenswrapper[4722]: W0219 19:37:14.214841 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfec288d_0744_48b4_8fcb_9ba349ebb6c4.slice/crio-b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5 WatchSource:0}: Error finding container b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5: Status 404 returned error can't find the container with id b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5 Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.285216 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.360063 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" event={"ID":"dfec288d-0744-48b4-8fcb-9ba349ebb6c4","Type":"ContainerStarted","Data":"b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5"} Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.365741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-282bs" event={"ID":"9470e2b8-0f01-4735-8050-1bae363b3a02","Type":"ContainerStarted","Data":"cc076b1f283aca77291d9912f5fc1dc5f832fcaa4d23e0d84625301b815b2d2a"} Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.365873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-282bs" event={"ID":"9470e2b8-0f01-4735-8050-1bae363b3a02","Type":"ContainerStarted","Data":"475eba7baafd2537d18af499574b953cb465d02e6ead97fc6eef6dffd274ae95"} Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.370257 4722 generic.go:334] "Generic (PLEG): container finished" podID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerID="a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434" exitCode=0 Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.370469 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.370654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" event={"ID":"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465","Type":"ContainerDied","Data":"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434"} Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.370980 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-dt86l" event={"ID":"4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465","Type":"ContainerDied","Data":"f69d565a432964650b0242acf131b696304e0220fdb9a6380a634faf46d56f00"} Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.371001 4722 scope.go:117] "RemoveContainer" containerID="a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.384422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"53444e7f-4c1d-401b-9896-5ff9c4aab65a","Type":"ContainerStarted","Data":"c497eb715405c49a41ac9c6c19dd91ccc2639f412bb662f551950a2a6a4f9593"} Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.387248 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-282bs" podStartSLOduration=2.387231502 podStartE2EDuration="2.387231502s" podCreationTimestamp="2026-02-19 19:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:14.384082544 +0000 UTC m=+1133.996432878" watchObservedRunningTime="2026-02-19 19:37:14.387231502 +0000 UTC m=+1133.999581826" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.407353 4722 generic.go:334] "Generic (PLEG): container finished" podID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerID="a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94" 
exitCode=0 Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.407587 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" event={"ID":"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b","Type":"ContainerDied","Data":"a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94"} Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.407645 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" event={"ID":"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b","Type":"ContainerStarted","Data":"ac0471b72b28ebe62e8e5340675d6bb6dfa27b0ec4d4c1d82ec67fc0c6b1fab9"} Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.426443 4722 scope.go:117] "RemoveContainer" containerID="99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.428232 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=33.894086704 podStartE2EDuration="50.428214397s" podCreationTimestamp="2026-02-19 19:36:24 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.315224916 +0000 UTC m=+1108.927575230" lastFinishedPulling="2026-02-19 19:37:05.849352599 +0000 UTC m=+1125.461702923" observedRunningTime="2026-02-19 19:37:14.426799814 +0000 UTC m=+1134.039150138" watchObservedRunningTime="2026-02-19 19:37:14.428214397 +0000 UTC m=+1134.040564721" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.468443 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"] Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.487315 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-dt86l"] Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.514455 4722 scope.go:117] "RemoveContainer" containerID="a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434" Feb 19 19:37:14 crc kubenswrapper[4722]: E0219 
19:37:14.514802 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434\": container with ID starting with a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434 not found: ID does not exist" containerID="a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.514829 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434"} err="failed to get container status \"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434\": rpc error: code = NotFound desc = could not find container \"a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434\": container with ID starting with a6d8877a45223562444eac35fdb9035e578955d03edabd6f916e1919540b7434 not found: ID does not exist" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.514848 4722 scope.go:117] "RemoveContainer" containerID="99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97" Feb 19 19:37:14 crc kubenswrapper[4722]: E0219 19:37:14.515014 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97\": container with ID starting with 99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97 not found: ID does not exist" containerID="99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.515033 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97"} err="failed to get container status \"99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97\": rpc 
error: code = NotFound desc = could not find container \"99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97\": container with ID starting with 99d916fe232c160ee40b09ff7a89b250ed8d6f070a42a5961e616d61ae3c9a97 not found: ID does not exist" Feb 19 19:37:14 crc kubenswrapper[4722]: I0219 19:37:14.768916 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 19:37:14 crc kubenswrapper[4722]: W0219 19:37:14.771170 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f8e6f58_f989_41f2_b8cb_c798405cfa33.slice/crio-9cfd7e086843143d6ce602ef5bdaf1dac3de6a22763cea3593a35f0ae91f5864 WatchSource:0}: Error finding container 9cfd7e086843143d6ce602ef5bdaf1dac3de6a22763cea3593a35f0ae91f5864: Status 404 returned error can't find the container with id 9cfd7e086843143d6ce602ef5bdaf1dac3de6a22763cea3593a35f0ae91f5864 Feb 19 19:37:15 crc kubenswrapper[4722]: I0219 19:37:15.084357 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b6c8b5-9711-4601-a0fd-a1f528e97287" path="/var/lib/kubelet/pods/17b6c8b5-9711-4601-a0fd-a1f528e97287/volumes" Feb 19 19:37:15 crc kubenswrapper[4722]: I0219 19:37:15.085656 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" path="/var/lib/kubelet/pods/4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465/volumes" Feb 19 19:37:15 crc kubenswrapper[4722]: I0219 19:37:15.428690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6f8e6f58-f989-41f2-b8cb-c798405cfa33","Type":"ContainerStarted","Data":"9cfd7e086843143d6ce602ef5bdaf1dac3de6a22763cea3593a35f0ae91f5864"} Feb 19 19:37:15 crc kubenswrapper[4722]: I0219 19:37:15.430492 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerID="62cc34e349902eca38fc94fdcd77006a8905ea0cb9cbb3392c7d1c40da4629fc" exitCode=0 Feb 
19 19:37:15 crc kubenswrapper[4722]: I0219 19:37:15.430552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" event={"ID":"dfec288d-0744-48b4-8fcb-9ba349ebb6c4","Type":"ContainerDied","Data":"62cc34e349902eca38fc94fdcd77006a8905ea0cb9cbb3392c7d1c40da4629fc"} Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.063968 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.064344 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.442580 4722 generic.go:334] "Generic (PLEG): container finished" podID="78e7f1b1-be76-4f05-bd63-ff87b440e173" containerID="873d37ee99fb511c8da26dd67c2f29770d664f57f955aa7d18b5dc0f234df076" exitCode=0 Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.442825 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"78e7f1b1-be76-4f05-bd63-ff87b440e173","Type":"ContainerDied","Data":"873d37ee99fb511c8da26dd67c2f29770d664f57f955aa7d18b5dc0f234df076"} Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.449394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" event={"ID":"dfec288d-0744-48b4-8fcb-9ba349ebb6c4","Type":"ContainerStarted","Data":"044ff08c5dbbd2f41c731beab45cb688557289abbb1920032c7fa0385f11e9f7"} Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.450817 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.453375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" 
event={"ID":"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b","Type":"ContainerStarted","Data":"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862"} Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.453803 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.455186 4722 generic.go:334] "Generic (PLEG): container finished" podID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerID="a57f0a1057a7622bf6cd5a97f7d1c754dd0d44986fc9d7f455890c4bc7caac51" exitCode=0 Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.455210 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerDied","Data":"a57f0a1057a7622bf6cd5a97f7d1c754dd0d44986fc9d7f455890c4bc7caac51"} Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.519354 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" podStartSLOduration=3.519337954 podStartE2EDuration="3.519337954s" podCreationTimestamp="2026-02-19 19:37:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:16.510749257 +0000 UTC m=+1136.123099581" watchObservedRunningTime="2026-02-19 19:37:16.519337954 +0000 UTC m=+1136.131688268" Feb 19 19:37:16 crc kubenswrapper[4722]: I0219 19:37:16.531833 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" podStartSLOduration=4.531811993 podStartE2EDuration="4.531811993s" podCreationTimestamp="2026-02-19 19:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:16.529533852 +0000 UTC m=+1136.141884186" watchObservedRunningTime="2026-02-19 
19:37:16.531811993 +0000 UTC m=+1136.144162317" Feb 19 19:37:17 crc kubenswrapper[4722]: I0219 19:37:17.465500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6f8e6f58-f989-41f2-b8cb-c798405cfa33","Type":"ContainerStarted","Data":"ccd7812cf29dbb3027cfbe13bca2d67873c62d4c527ff9b258bd53d7c9b1855c"} Feb 19 19:37:17 crc kubenswrapper[4722]: I0219 19:37:17.465827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6f8e6f58-f989-41f2-b8cb-c798405cfa33","Type":"ContainerStarted","Data":"3a29aa6cae244726865ab666544baaa48e5eeb85a103b154e2fa0dfe70455672"} Feb 19 19:37:17 crc kubenswrapper[4722]: I0219 19:37:17.481107 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 19 19:37:17 crc kubenswrapper[4722]: I0219 19:37:17.481405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 19 19:37:17 crc kubenswrapper[4722]: I0219 19:37:17.489792 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.848085668 podStartE2EDuration="4.48977317s" podCreationTimestamp="2026-02-19 19:37:13 +0000 UTC" firstStartedPulling="2026-02-19 19:37:14.773846902 +0000 UTC m=+1134.386197226" lastFinishedPulling="2026-02-19 19:37:16.415534414 +0000 UTC m=+1136.027884728" observedRunningTime="2026-02-19 19:37:17.483945699 +0000 UTC m=+1137.096296093" watchObservedRunningTime="2026-02-19 19:37:17.48977317 +0000 UTC m=+1137.102123504" Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.480128 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" event={"ID":"fc37f35d-ac2f-40a0-90e1-40c3b80b1782","Type":"ContainerStarted","Data":"86abf35ae7857af2662ed6f4bce3b72afb340a5c2b54a1e546605550ce412da0"} Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 
19:37:18.480999 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.482281 4722 generic.go:334] "Generic (PLEG): container finished" podID="c8300e35-4c72-4398-9058-0aa76005d576" containerID="663d74fba80dd334e859cdb2cd8cfc2f7c0c52291c1501db2186e2522941c439" exitCode=0 Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.482369 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fwvrs" event={"ID":"c8300e35-4c72-4398-9058-0aa76005d576","Type":"ContainerDied","Data":"663d74fba80dd334e859cdb2cd8cfc2f7c0c52291c1501db2186e2522941c439"} Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.483007 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.502427 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.506059 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-qxjk2" podStartSLOduration=10.302538347 podStartE2EDuration="38.506039102s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.946300432 +0000 UTC m=+1109.558650756" lastFinishedPulling="2026-02-19 19:37:18.149801187 +0000 UTC m=+1137.762151511" observedRunningTime="2026-02-19 19:37:18.50564163 +0000 UTC m=+1138.117991974" watchObservedRunningTime="2026-02-19 19:37:18.506039102 +0000 UTC m=+1138.118389426" Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.777327 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 19 19:37:18 crc kubenswrapper[4722]: I0219 19:37:18.920395 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.315731 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.347647 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-jvpfv"] Feb 19 19:37:19 crc kubenswrapper[4722]: E0219 19:37:19.349355 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="init" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.349396 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="init" Feb 19 19:37:19 crc kubenswrapper[4722]: E0219 19:37:19.349432 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="dnsmasq-dns" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.349443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="dnsmasq-dns" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.349733 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3acaf5-2a64-4dbd-8a74-f4e10cbd5465" containerName="dnsmasq-dns" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.350888 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.384768 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jvpfv"] Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.498669 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14a7aae0-6a51-49ed-b4dd-9b274885d1da","Type":"ContainerStarted","Data":"3a2f38c278decbb381ff361931bea01935f3b90be53c0932153ee1cc0d0759f2"} Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.498908 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.506497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fwvrs" event={"ID":"c8300e35-4c72-4398-9058-0aa76005d576","Type":"ContainerStarted","Data":"72b185aa4e6327015ed4749ec09a7270a569c75eac7f0279641fa9858918a81e"} Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.506920 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="dnsmasq-dns" containerID="cri-o://6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862" gracePeriod=10 Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.515734 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=22.891697779 podStartE2EDuration="51.515716689s" podCreationTimestamp="2026-02-19 19:36:28 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.948352416 +0000 UTC m=+1109.560702740" lastFinishedPulling="2026-02-19 19:37:18.572371326 +0000 UTC m=+1138.184721650" observedRunningTime="2026-02-19 19:37:19.515525333 +0000 UTC m=+1139.127875657" watchObservedRunningTime="2026-02-19 19:37:19.515716689 +0000 UTC m=+1139.128067013" 
Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.526276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.526333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.526460 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs5nx\" (UniqueName: \"kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.526508 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.526531 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " 
pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.627959 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.628041 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.628182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.628219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.628389 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs5nx\" (UniqueName: \"kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc 
kubenswrapper[4722]: I0219 19:37:19.629708 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.630010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.630099 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.631428 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.657903 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs5nx\" (UniqueName: \"kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx\") pod \"dnsmasq-dns-698758b865-jvpfv\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.670775 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.785212 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 19 19:37:19 crc kubenswrapper[4722]: I0219 19:37:19.872228 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.363821 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.486966 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.488425 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="init" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.488451 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="init" Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.488489 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="dnsmasq-dns" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.488498 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="dnsmasq-dns" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.489563 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerName="dnsmasq-dns" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.532805 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.532944 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.535980 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.536387 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.536439 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.540542 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2g4z6" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.546029 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"78e7f1b1-be76-4f05-bd63-ff87b440e173","Type":"ContainerStarted","Data":"95da56850b2ea7e8730ab7324347b55ee33efb05afb659de11188f775ecfa216"} Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.547909 4722 generic.go:334] "Generic (PLEG): container finished" podID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" containerID="6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862" exitCode=0 Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.548894 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.548984 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" event={"ID":"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b","Type":"ContainerDied","Data":"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862"} Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.549015 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-794vh" event={"ID":"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b","Type":"ContainerDied","Data":"ac0471b72b28ebe62e8e5340675d6bb6dfa27b0ec4d4c1d82ec67fc0c6b1fab9"} Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.549034 4722 scope.go:117] "RemoveContainer" containerID="6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.549299 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb\") pod \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.549378 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc\") pod \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.550723 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzgmn\" (UniqueName: \"kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn\") pod \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.550920 
4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config\") pod \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\" (UID: \"ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b\") " Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.555720 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn" (OuterVolumeSpecName: "kube-api-access-kzgmn") pod "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" (UID: "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b"). InnerVolumeSpecName "kube-api-access-kzgmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.607319 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config" (OuterVolumeSpecName: "config") pod "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" (UID: "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.615840 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" (UID: "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.622768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" (UID: "ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.628139 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jvpfv"] Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.652791 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v57xk\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-kube-api-access-v57xk\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.652839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-cache\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.652940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653046 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98dc74a5-9538-49e4-9dd0-eb2735f18d41-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-lock\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653223 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzgmn\" (UniqueName: \"kubernetes.io/projected/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-kube-api-access-kzgmn\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653240 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653251 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.653261 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.755061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " 
pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.755127 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98dc74a5-9538-49e4-9dd0-eb2735f18d41-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.755263 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-lock\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.755954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-lock\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.756121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v57xk\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-kube-api-access-v57xk\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.756182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-cache\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.756265 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.756454 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.756473 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.756520 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" failed. No retries permitted until 2026-02-19 19:37:21.256500557 +0000 UTC m=+1140.868850881 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.757051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/98dc74a5-9538-49e4-9dd0-eb2735f18d41-cache\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.760281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98dc74a5-9538-49e4-9dd0-eb2735f18d41-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.763480 4722 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.763516 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/603a810ec5db859f322e09f708b74d3e59133a63d24627063418a6e2b2532b88/globalmount\"" pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.778686 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v57xk\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-kube-api-access-v57xk\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.796957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab7f4fdc-0e31-4c25-ad01-a1c8200d60f3\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.885384 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.892879 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-794vh"] Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.931743 4722 scope.go:117] "RemoveContainer" containerID="a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94" Feb 19 19:37:20 crc kubenswrapper[4722]: 
W0219 19:37:20.936266 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb12e3334_cc75_47af_870a_3d86164cb249.slice/crio-1028e5969f0d7dbc2c219bf0143cef7647b9346e4f36673b7b607399975bc325 WatchSource:0}: Error finding container 1028e5969f0d7dbc2c219bf0143cef7647b9346e4f36673b7b607399975bc325: Status 404 returned error can't find the container with id 1028e5969f0d7dbc2c219bf0143cef7647b9346e4f36673b7b607399975bc325 Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.952699 4722 scope.go:117] "RemoveContainer" containerID="6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862" Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.953137 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862\": container with ID starting with 6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862 not found: ID does not exist" containerID="6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.953189 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862"} err="failed to get container status \"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862\": rpc error: code = NotFound desc = could not find container \"6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862\": container with ID starting with 6041f17958aa8380ee86c6e83b57f41e2759c685e79c43861e9067e7d3579862 not found: ID does not exist" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.953223 4722 scope.go:117] "RemoveContainer" containerID="a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94" Feb 19 19:37:20 crc kubenswrapper[4722]: E0219 19:37:20.953570 4722 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94\": container with ID starting with a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94 not found: ID does not exist" containerID="a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94" Feb 19 19:37:20 crc kubenswrapper[4722]: I0219 19:37:20.953636 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94"} err="failed to get container status \"a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94\": rpc error: code = NotFound desc = could not find container \"a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94\": container with ID starting with a51867bc17c6750cae3e9367bd227390adeccad1082b3257145b11a61cea3a94 not found: ID does not exist" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.101810 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b" path="/var/lib/kubelet/pods/ec94b2bc-5f45-48da-aaba-8bd3e5c8e29b/volumes" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.270614 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:21 crc kubenswrapper[4722]: E0219 19:37:21.271061 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 19:37:21 crc kubenswrapper[4722]: E0219 19:37:21.271077 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 19:37:21 crc 
kubenswrapper[4722]: E0219 19:37:21.271120 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" failed. No retries permitted until 2026-02-19 19:37:22.271106519 +0000 UTC m=+1141.883456833 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.559367 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" event={"ID":"47cbe0b4-7d45-486b-9e9b-964db524e7ab","Type":"ContainerStarted","Data":"96dae1aa32b302d02a862017213278a24313e48e8ece75a386cfd1ad66863741"} Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.560358 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.564095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-fwvrs" event={"ID":"c8300e35-4c72-4398-9058-0aa76005d576","Type":"ContainerStarted","Data":"09dc4a18c8e5633c815e20e2a05190540e7081aa9ea19714f54e4871e3e23e07"} Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.564259 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.570031 4722 generic.go:334] "Generic (PLEG): container finished" podID="b12e3334-cc75-47af-870a-3d86164cb249" containerID="58f8459d38255bc0ee2a3b1d7c9b5ab8e43bfd9e3de2e5dd8ef6021c2a7233ed" exitCode=0 Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.570070 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-698758b865-jvpfv" event={"ID":"b12e3334-cc75-47af-870a-3d86164cb249","Type":"ContainerDied","Data":"58f8459d38255bc0ee2a3b1d7c9b5ab8e43bfd9e3de2e5dd8ef6021c2a7233ed"} Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.570093 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jvpfv" event={"ID":"b12e3334-cc75-47af-870a-3d86164cb249","Type":"ContainerStarted","Data":"1028e5969f0d7dbc2c219bf0143cef7647b9346e4f36673b7b607399975bc325"} Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.572794 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.582023 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-2j29g" podStartSLOduration=11.877379129 podStartE2EDuration="41.581972752s" podCreationTimestamp="2026-02-19 19:36:40 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.429975292 +0000 UTC m=+1110.042325616" lastFinishedPulling="2026-02-19 19:37:20.134568905 +0000 UTC m=+1139.746919239" observedRunningTime="2026-02-19 19:37:21.577877054 +0000 UTC m=+1141.190227388" watchObservedRunningTime="2026-02-19 19:37:21.581972752 +0000 UTC m=+1141.194323066" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.899839 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 19 19:37:21 crc kubenswrapper[4722]: I0219 19:37:21.917360 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-fwvrs" podStartSLOduration=22.132110712 podStartE2EDuration="48.917345877s" podCreationTimestamp="2026-02-19 19:36:33 +0000 UTC" firstStartedPulling="2026-02-19 19:36:50.696453564 +0000 UTC m=+1110.308803898" lastFinishedPulling="2026-02-19 19:37:17.481688699 +0000 UTC 
m=+1137.094039063" observedRunningTime="2026-02-19 19:37:21.669481135 +0000 UTC m=+1141.281831459" watchObservedRunningTime="2026-02-19 19:37:21.917345877 +0000 UTC m=+1141.529696201" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.298485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:22 crc kubenswrapper[4722]: E0219 19:37:22.298727 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 19:37:22 crc kubenswrapper[4722]: E0219 19:37:22.298889 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 19:37:22 crc kubenswrapper[4722]: E0219 19:37:22.298946 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" failed. No retries permitted until 2026-02-19 19:37:24.298931561 +0000 UTC m=+1143.911281885 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.500120 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1f02-account-create-update-cslgg"] Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.501202 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.503544 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.516899 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1f02-account-create-update-cslgg"] Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.542554 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-5m87g"] Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.544255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.548564 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5m87g"] Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.580097 4722 generic.go:334] "Generic (PLEG): container finished" podID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerID="c749648f12e8840f28b25f37f34a53275ed4fc33d82900da005066210acf9af2" exitCode=0 Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.580189 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerDied","Data":"c749648f12e8840f28b25f37f34a53275ed4fc33d82900da005066210acf9af2"} Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.582952 4722 generic.go:334] "Generic (PLEG): container finished" podID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerID="e127436a9b7fd84ddf258ebc3a3c64c5ddb9a7269490c5535eccdc44ec44422d" exitCode=0 Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.583006 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerDied","Data":"e127436a9b7fd84ddf258ebc3a3c64c5ddb9a7269490c5535eccdc44ec44422d"} Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.585251 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jvpfv" event={"ID":"b12e3334-cc75-47af-870a-3d86164cb249","Type":"ContainerStarted","Data":"acdda2995a7c01c2bb56033df969b7a728b55472a6cb4f9472db1062d35bc9c3"} Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.585550 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.607298 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts\") pod \"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.607381 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j4s2\" (UniqueName: \"kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2\") pod \"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.652033 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podStartSLOduration=3.652015328 podStartE2EDuration="3.652015328s" podCreationTimestamp="2026-02-19 19:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:22.651826912 +0000 UTC m=+1142.264177246" 
watchObservedRunningTime="2026-02-19 19:37:22.652015328 +0000 UTC m=+1142.264365662" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.713315 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts\") pod \"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.713418 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.713497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j4s2\" (UniqueName: \"kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2\") pod \"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.714475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88jsr\" (UniqueName: \"kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.715680 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts\") pod 
\"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.737831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j4s2\" (UniqueName: \"kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2\") pod \"glance-1f02-account-create-update-cslgg\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.817237 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88jsr\" (UniqueName: \"kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.817368 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.818358 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.821740 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.841589 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88jsr\" (UniqueName: \"kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr\") pod \"glance-db-create-5m87g\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") " pod="openstack/glance-db-create-5m87g" Feb 19 19:37:22 crc kubenswrapper[4722]: I0219 19:37:22.861083 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5m87g" Feb 19 19:37:23 crc kubenswrapper[4722]: I0219 19:37:23.593286 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:23 crc kubenswrapper[4722]: I0219 19:37:23.652521 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.347473 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:24 crc kubenswrapper[4722]: E0219 19:37:24.347997 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 19:37:24 crc kubenswrapper[4722]: E0219 19:37:24.348030 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 19:37:24 crc kubenswrapper[4722]: E0219 19:37:24.348123 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" 
failed. No retries permitted until 2026-02-19 19:37:28.348067041 +0000 UTC m=+1147.960417365 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.368967 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-hlljb"] Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.370523 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.372309 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.393467 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hlljb"] Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.445705 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-q5fhk"] Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.446787 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.449109 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.449580 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.449613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2fkf\" (UniqueName: \"kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.449665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.449842 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.480731 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-q5fhk"]
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnhgw\" (UniqueName: \"kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551528 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2fkf\" (UniqueName: \"kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551586 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551634 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.551696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.552790 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.581271 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2fkf\" (UniqueName: \"kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf\") pod \"root-account-create-update-hlljb\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") " pod="openstack/root-account-create-update-hlljb"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.609661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"78e7f1b1-be76-4f05-bd63-ff87b440e173","Type":"ContainerStarted","Data":"77e650597cadcbbcf8f49dc2aec43ffb45cc5d2a44416a9317b89a31c13904b9"}
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.636888 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=24.805993461 podStartE2EDuration="55.636868528s" podCreationTimestamp="2026-02-19 19:36:29 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.303291325 +0000 UTC m=+1108.915641649" lastFinishedPulling="2026-02-19 19:37:20.134166392 +0000 UTC m=+1139.746516716" observedRunningTime="2026-02-19 19:37:24.62857029 +0000 UTC m=+1144.240920624" watchObservedRunningTime="2026-02-19 19:37:24.636868528 +0000 UTC m=+1144.249218852"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.653213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654023 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654142 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654326 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654445 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnhgw\" (UniqueName: \"kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654672 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.654828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.655728 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.655886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.657244 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.658141 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.658724 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.671266 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnhgw\" (UniqueName: \"kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw\") pod \"swift-ring-rebalance-q5fhk\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.715313 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hlljb"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.768807 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-2g4z6"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.776171 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-q5fhk"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.961602 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Feb 19 19:37:24 crc kubenswrapper[4722]: I0219 19:37:24.965233 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.189253 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1f02-account-create-update-cslgg"]
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.318642 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-5m87g"]
Feb 19 19:37:25 crc kubenswrapper[4722]: W0219 19:37:25.342434 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f940a76_c93f_46c5_af29_5b098a54adc8.slice/crio-35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe WatchSource:0}: Error finding container 35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe: Status 404 returned error can't find the container with id 35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.431431 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-q5fhk"]
Feb 19 19:37:25 crc kubenswrapper[4722]: W0219 19:37:25.444905 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc81edb08_7ac8_4cfc_abce_5895b8e7b59b.slice/crio-54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924 WatchSource:0}: Error finding container 54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924: Status 404 returned error can't find the container with id 54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.515795 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-hlljb"]
Feb 19 19:37:25 crc kubenswrapper[4722]: W0219 19:37:25.518132 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd26936c_cebb_4507_92cb_45c7af5b7762.slice/crio-bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40 WatchSource:0}: Error finding container bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40: Status 404 returned error can't find the container with id bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.618407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hlljb" event={"ID":"fd26936c-cebb-4507-92cb-45c7af5b7762","Type":"ContainerStarted","Data":"bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40"}
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.619951 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1f02-account-create-update-cslgg" event={"ID":"03387e77-59d8-4377-9a1c-dac948d84b59","Type":"ContainerStarted","Data":"d614fd1da3e70b89a53ee5e8d38b91ca481cc6e55ebe3919a12aefd8b96f7538"}
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.619988 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1f02-account-create-update-cslgg" event={"ID":"03387e77-59d8-4377-9a1c-dac948d84b59","Type":"ContainerStarted","Data":"d7437c095cf48c8adc0f2290a63f74e82e909327bc7acc88e8bdea32256fc6c2"}
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.622294 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerStarted","Data":"fa229a7bb206de4ccc0307a479f0fa815abfa412795902c84987eb4df94f0285"}
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.622513 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.623660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q5fhk" event={"ID":"c81edb08-7ac8-4cfc-abce-5895b8e7b59b","Type":"ContainerStarted","Data":"54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924"}
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.625527 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerStarted","Data":"37cb328a31e79626446e5419a5f224da9c1e9f252a7b3a3099897e049cefbfc4"}
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.626529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.638749 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerStarted","Data":"572f93c668d26d7ec11607aad487fa047b3c482800703fea034a7c2c7174262f"}
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.640899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5m87g" event={"ID":"1f940a76-c93f-46c5-af29-5b098a54adc8","Type":"ContainerStarted","Data":"65724bcd3ed9cb9dac1ea77b176d69bbb52e388afbda6a5fe57b607a6390a7e4"}
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.640951 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5m87g" event={"ID":"1f940a76-c93f-46c5-af29-5b098a54adc8","Type":"ContainerStarted","Data":"35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe"}
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.652952 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-1f02-account-create-update-cslgg" podStartSLOduration=3.6529367539999997 podStartE2EDuration="3.652936754s" podCreationTimestamp="2026-02-19 19:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:25.644547752 +0000 UTC m=+1145.256898086" watchObservedRunningTime="2026-02-19 19:37:25.652936754 +0000 UTC m=+1145.265287068"
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.676792 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=46.885012153 podStartE2EDuration="1m2.676778715s" podCreationTimestamp="2026-02-19 19:36:23 +0000 UTC" firstStartedPulling="2026-02-19 19:36:32.770795164 +0000 UTC m=+1092.383145488" lastFinishedPulling="2026-02-19 19:36:48.562561726 +0000 UTC m=+1108.174912050" observedRunningTime="2026-02-19 19:37:25.668504347 +0000 UTC m=+1145.280854691" watchObservedRunningTime="2026-02-19 19:37:25.676778715 +0000 UTC m=+1145.289129039"
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.707814 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.333951554 podStartE2EDuration="1m3.70780039s" podCreationTimestamp="2026-02-19 19:36:22 +0000 UTC" firstStartedPulling="2026-02-19 19:36:24.866077622 +0000 UTC m=+1084.478427946" lastFinishedPulling="2026-02-19 19:36:48.239926458 +0000 UTC m=+1107.852276782" observedRunningTime="2026-02-19 19:37:25.704937311 +0000 UTC m=+1145.317287655" watchObservedRunningTime="2026-02-19 19:37:25.70780039 +0000 UTC m=+1145.320150714"
Feb 19 19:37:25 crc kubenswrapper[4722]: I0219 19:37:25.729320 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-5m87g" podStartSLOduration=3.72930167 podStartE2EDuration="3.72930167s" podCreationTimestamp="2026-02-19 19:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:25.720893128 +0000 UTC m=+1145.333243472" watchObservedRunningTime="2026-02-19 19:37:25.72930167 +0000 UTC m=+1145.341651994"
Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.653360 4722 generic.go:334] "Generic (PLEG): container finished" podID="1f940a76-c93f-46c5-af29-5b098a54adc8" containerID="65724bcd3ed9cb9dac1ea77b176d69bbb52e388afbda6a5fe57b607a6390a7e4" exitCode=0
Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.653405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5m87g" event={"ID":"1f940a76-c93f-46c5-af29-5b098a54adc8","Type":"ContainerDied","Data":"65724bcd3ed9cb9dac1ea77b176d69bbb52e388afbda6a5fe57b607a6390a7e4"}
Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.655854 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd26936c-cebb-4507-92cb-45c7af5b7762" containerID="ec6e9a5d8db1ce9bec823742a602001b48238109f03304859ab2fe4f5a1aeb10" exitCode=0
Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.655914 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hlljb" event={"ID":"fd26936c-cebb-4507-92cb-45c7af5b7762","Type":"ContainerDied","Data":"ec6e9a5d8db1ce9bec823742a602001b48238109f03304859ab2fe4f5a1aeb10"}
Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.657850 4722 generic.go:334] "Generic (PLEG): container finished" podID="03387e77-59d8-4377-9a1c-dac948d84b59" containerID="d614fd1da3e70b89a53ee5e8d38b91ca481cc6e55ebe3919a12aefd8b96f7538" exitCode=0
Feb 19 19:37:26 crc kubenswrapper[4722]: I0219 19:37:26.657929 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1f02-account-create-update-cslgg" event={"ID":"03387e77-59d8-4377-9a1c-dac948d84b59","Type":"ContainerDied","Data":"d614fd1da3e70b89a53ee5e8d38b91ca481cc6e55ebe3919a12aefd8b96f7538"}
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.114063 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hlljb"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.212902 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4b7g9"]
Feb 19 19:37:28 crc kubenswrapper[4722]: E0219 19:37:28.213360 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd26936c-cebb-4507-92cb-45c7af5b7762" containerName="mariadb-account-create-update"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.213376 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd26936c-cebb-4507-92cb-45c7af5b7762" containerName="mariadb-account-create-update"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.213571 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd26936c-cebb-4507-92cb-45c7af5b7762" containerName="mariadb-account-create-update"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.214276 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.231503 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4b7g9"]
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.233555 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts\") pod \"fd26936c-cebb-4507-92cb-45c7af5b7762\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") "
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.233740 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2fkf\" (UniqueName: \"kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf\") pod \"fd26936c-cebb-4507-92cb-45c7af5b7762\" (UID: \"fd26936c-cebb-4507-92cb-45c7af5b7762\") "
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.234822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fd26936c-cebb-4507-92cb-45c7af5b7762" (UID: "fd26936c-cebb-4507-92cb-45c7af5b7762"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.235968 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-5m87g"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.242619 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf" (OuterVolumeSpecName: "kube-api-access-h2fkf") pod "fd26936c-cebb-4507-92cb-45c7af5b7762" (UID: "fd26936c-cebb-4507-92cb-45c7af5b7762"). InnerVolumeSpecName "kube-api-access-h2fkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.308818 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c526-account-create-update-lmx4k"]
Feb 19 19:37:28 crc kubenswrapper[4722]: E0219 19:37:28.309209 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f940a76-c93f-46c5-af29-5b098a54adc8" containerName="mariadb-database-create"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.309225 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f940a76-c93f-46c5-af29-5b098a54adc8" containerName="mariadb-database-create"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.309381 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f940a76-c93f-46c5-af29-5b098a54adc8" containerName="mariadb-database-create"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.310001 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.311758 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.328189 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c526-account-create-update-lmx4k"]
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.335269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88jsr\" (UniqueName: \"kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr\") pod \"1f940a76-c93f-46c5-af29-5b098a54adc8\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") "
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.335338 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts\") pod \"1f940a76-c93f-46c5-af29-5b098a54adc8\" (UID: \"1f940a76-c93f-46c5-af29-5b098a54adc8\") "
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.335834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.335883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9glrq\" (UniqueName: \"kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.336041 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fd26936c-cebb-4507-92cb-45c7af5b7762-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.336060 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2fkf\" (UniqueName: \"kubernetes.io/projected/fd26936c-cebb-4507-92cb-45c7af5b7762-kube-api-access-h2fkf\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.336288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f940a76-c93f-46c5-af29-5b098a54adc8" (UID: "1f940a76-c93f-46c5-af29-5b098a54adc8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.339412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr" (OuterVolumeSpecName: "kube-api-access-88jsr") pod "1f940a76-c93f-46c5-af29-5b098a54adc8" (UID: "1f940a76-c93f-46c5-af29-5b098a54adc8"). InnerVolumeSpecName "kube-api-access-88jsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.436997 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l4ln\" (UniqueName: \"kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9glrq\" (UniqueName: \"kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437098 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437132 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437212 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88jsr\" (UniqueName: \"kubernetes.io/projected/1f940a76-c93f-46c5-af29-5b098a54adc8-kube-api-access-88jsr\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437228 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f940a76-c93f-46c5-af29-5b098a54adc8-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.437859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:28 crc kubenswrapper[4722]: E0219 19:37:28.439101 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 19:37:28 crc kubenswrapper[4722]: E0219 19:37:28.439121 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 19:37:28 crc kubenswrapper[4722]: E0219 19:37:28.439197 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" failed. No retries permitted until 2026-02-19 19:37:36.43918319 +0000 UTC m=+1156.051533524 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.453366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9glrq\" (UniqueName: \"kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq\") pod \"keystone-db-create-4b7g9\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") " pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.494548 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-lqnqr"]
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.495708 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqnqr"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.510386 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2439-account-create-update-lqmn5"]
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.511589 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2439-account-create-update-lqmn5"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.512899 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.519061 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lqnqr"]
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.528616 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2439-account-create-update-lqmn5"]
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.538298 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l4ln\" (UniqueName: \"kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.538353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28cwg\" (UniqueName: \"kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.538421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.538457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.539048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.560588 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l4ln\" (UniqueName: \"kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln\") pod \"keystone-c526-account-create-update-lmx4k\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") " pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.585806 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.624897 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.640556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28cwg\" (UniqueName: \"kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.640665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " pod="openstack/placement-2439-account-create-update-lqmn5"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.640690 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.640793 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt78j\" (UniqueName: \"kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " pod="openstack/placement-2439-account-create-update-lqmn5"
Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.641567 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName:
\"kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.659564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28cwg\" (UniqueName: \"kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg\") pod \"placement-db-create-lqnqr\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") " pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.678173 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerStarted","Data":"c2baa075267fa149454aabf4b426a4fea2dd3c3a6aa19421e7bc91c894e1e821"} Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.679977 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-5m87g" event={"ID":"1f940a76-c93f-46c5-af29-5b098a54adc8","Type":"ContainerDied","Data":"35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe"} Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.680001 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35fe2b07391583c81c851153df917ed8ba6550ad626d269fb4bafb07a63eb1fe" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.680050 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-5m87g" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.683116 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-hlljb" event={"ID":"fd26936c-cebb-4507-92cb-45c7af5b7762","Type":"ContainerDied","Data":"bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40"} Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.683182 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf3851fd30981a68640006391374227eeffa915cb5b31de6bf56c53ab672bd40" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.683204 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-hlljb" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.742794 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt78j\" (UniqueName: \"kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.743385 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.744214 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " 
pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.787903 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt78j\" (UniqueName: \"kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j\") pod \"placement-2439-account-create-update-lqmn5\" (UID: \"44afb335-8449-4492-a772-78889877810e\") " pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.812091 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqnqr" Feb 19 19:37:28 crc kubenswrapper[4722]: I0219 19:37:28.828600 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2439-account-create-update-lqmn5" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.368535 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.643929 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.673358 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.738056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1f02-account-create-update-cslgg" event={"ID":"03387e77-59d8-4377-9a1c-dac948d84b59","Type":"ContainerDied","Data":"d7437c095cf48c8adc0f2290a63f74e82e909327bc7acc88e8bdea32256fc6c2"} Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.738107 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7437c095cf48c8adc0f2290a63f74e82e909327bc7acc88e8bdea32256fc6c2" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.738268 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1f02-account-create-update-cslgg" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.741426 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"] Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.741617 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="dnsmasq-dns" containerID="cri-o://044ff08c5dbbd2f41c731beab45cb688557289abbb1920032c7fa0385f11e9f7" gracePeriod=10 Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.761662 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts\") pod \"03387e77-59d8-4377-9a1c-dac948d84b59\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.761821 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-8j4s2\" (UniqueName: \"kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2\") pod \"03387e77-59d8-4377-9a1c-dac948d84b59\" (UID: \"03387e77-59d8-4377-9a1c-dac948d84b59\") " Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.762225 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03387e77-59d8-4377-9a1c-dac948d84b59" (UID: "03387e77-59d8-4377-9a1c-dac948d84b59"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.762431 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03387e77-59d8-4377-9a1c-dac948d84b59-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.768438 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2" (OuterVolumeSpecName: "kube-api-access-8j4s2") pod "03387e77-59d8-4377-9a1c-dac948d84b59" (UID: "03387e77-59d8-4377-9a1c-dac948d84b59"). InnerVolumeSpecName "kube-api-access-8j4s2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:29 crc kubenswrapper[4722]: I0219 19:37:29.864399 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j4s2\" (UniqueName: \"kubernetes.io/projected/03387e77-59d8-4377-9a1c-dac948d84b59-kube-api-access-8j4s2\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.483822 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-llw6c" Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.747609 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerID="044ff08c5dbbd2f41c731beab45cb688557289abbb1920032c7fa0385f11e9f7" exitCode=0 Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.747663 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" event={"ID":"dfec288d-0744-48b4-8fcb-9ba349ebb6c4","Type":"ContainerDied","Data":"044ff08c5dbbd2f41c731beab45cb688557289abbb1920032c7fa0385f11e9f7"} Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.760472 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-k6gcm" Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.813837 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8" Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.870403 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-hlljb"] Feb 19 19:37:30 crc kubenswrapper[4722]: I0219 19:37:30.889431 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-hlljb"] Feb 19 19:37:31 crc kubenswrapper[4722]: I0219 19:37:31.087813 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fd26936c-cebb-4507-92cb-45c7af5b7762" path="/var/lib/kubelet/pods/fd26936c-cebb-4507-92cb-45c7af5b7762/volumes" Feb 19 19:37:31 crc kubenswrapper[4722]: I0219 19:37:31.757735 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 19 19:37:31 crc kubenswrapper[4722]: I0219 19:37:31.951583 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a3fc19f1-6f9f-4f35-a391-1f6743480bd3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.725888 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-8fd9q"] Feb 19 19:37:32 crc kubenswrapper[4722]: E0219 19:37:32.726393 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03387e77-59d8-4377-9a1c-dac948d84b59" containerName="mariadb-account-create-update" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.726409 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="03387e77-59d8-4377-9a1c-dac948d84b59" containerName="mariadb-account-create-update" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.726656 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="03387e77-59d8-4377-9a1c-dac948d84b59" containerName="mariadb-account-create-update" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.727419 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.731120 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.731625 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9s8kl" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.771955 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8fd9q"] Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.807743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" event={"ID":"dfec288d-0744-48b4-8fcb-9ba349ebb6c4","Type":"ContainerDied","Data":"b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5"} Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.808100 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3bc6e7378e1e488a78e74c0d758a8daf6d37020e25d79f7587034900624dac5" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.821389 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.833729 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28qs4\" (UniqueName: \"kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.833792 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.833981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.834330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.936468 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb\") pod \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\" (UID: 
\"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.936595 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc\") pod \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.936655 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb\") pod \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.937131 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config\") pod \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.937281 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw227\" (UniqueName: \"kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227\") pod \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\" (UID: \"dfec288d-0744-48b4-8fcb-9ba349ebb6c4\") " Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.937711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.937981 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.938050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28qs4\" (UniqueName: \"kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.938090 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.946540 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227" (OuterVolumeSpecName: "kube-api-access-cw227") pod "dfec288d-0744-48b4-8fcb-9ba349ebb6c4" (UID: "dfec288d-0744-48b4-8fcb-9ba349ebb6c4"). InnerVolumeSpecName "kube-api-access-cw227". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.947443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.950066 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.956222 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:32 crc kubenswrapper[4722]: I0219 19:37:32.970670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28qs4\" (UniqueName: \"kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4\") pod \"glance-db-sync-8fd9q\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.032521 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dfec288d-0744-48b4-8fcb-9ba349ebb6c4" (UID: "dfec288d-0744-48b4-8fcb-9ba349ebb6c4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.040369 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.040410 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw227\" (UniqueName: \"kubernetes.io/projected/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-kube-api-access-cw227\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.043789 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config" (OuterVolumeSpecName: "config") pod "dfec288d-0744-48b4-8fcb-9ba349ebb6c4" (UID: "dfec288d-0744-48b4-8fcb-9ba349ebb6c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.056893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dfec288d-0744-48b4-8fcb-9ba349ebb6c4" (UID: "dfec288d-0744-48b4-8fcb-9ba349ebb6c4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.077089 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dfec288d-0744-48b4-8fcb-9ba349ebb6c4" (UID: "dfec288d-0744-48b4-8fcb-9ba349ebb6c4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.089606 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4b7g9"] Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.146523 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.146564 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.146577 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfec288d-0744-48b4-8fcb-9ba349ebb6c4-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.187455 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.233055 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c526-account-create-update-lmx4k"] Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.498784 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lqnqr"] Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.514065 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2439-account-create-update-lqmn5"] Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.799555 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-8fd9q"] Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.816047 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2439-account-create-update-lqmn5" event={"ID":"44afb335-8449-4492-a772-78889877810e","Type":"ContainerStarted","Data":"f892930a6a254a604ca1f62b197a46b6f4adf5d1a23ff675a6f4dceae3710829"} Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.817514 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqnqr" event={"ID":"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358","Type":"ContainerStarted","Data":"1eb7fc48bd8b72e0b5c8cc94587a61101731402ad9ff8b02d1b1e5d0d69c49be"} Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.818973 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q5fhk" event={"ID":"c81edb08-7ac8-4cfc-abce-5895b8e7b59b","Type":"ContainerStarted","Data":"827ec543eff6496863bdbf6ae3908b628e0d5862787c9446d39fe5652d9dbfa4"} Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.820603 4722 generic.go:334] "Generic (PLEG): container finished" podID="5bd3ad13-0324-4c1c-9b74-eb1401f06507" containerID="99c98b71002ac8948511844b6989a0da14ae66e034112843908355f3a72c44e7" exitCode=0 Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 
19:37:33.820674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4b7g9" event={"ID":"5bd3ad13-0324-4c1c-9b74-eb1401f06507","Type":"ContainerDied","Data":"99c98b71002ac8948511844b6989a0da14ae66e034112843908355f3a72c44e7"} Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.820699 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4b7g9" event={"ID":"5bd3ad13-0324-4c1c-9b74-eb1401f06507","Type":"ContainerStarted","Data":"5b71ad4ca4512f9223138f370e3a52a25097fcb37a438f94f9b1595c0fb1c496"} Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.821858 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c526-account-create-update-lmx4k" event={"ID":"93536b6f-8176-4737-a547-9face2995981","Type":"ContainerStarted","Data":"d43cf646287fe537785df6cf6532f9d6502d5c80eb8ccdd82b930f04b64f53a1"} Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.825180 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerStarted","Data":"efe902350ad0886c73508abcb086ad6fc6e169270b01937b8957c668bd35bc1d"} Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.825191 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-xtsln" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.840465 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-q5fhk" podStartSLOduration=2.654601531 podStartE2EDuration="9.840442164s" podCreationTimestamp="2026-02-19 19:37:24 +0000 UTC" firstStartedPulling="2026-02-19 19:37:25.447688407 +0000 UTC m=+1145.060038731" lastFinishedPulling="2026-02-19 19:37:32.63352904 +0000 UTC m=+1152.245879364" observedRunningTime="2026-02-19 19:37:33.835074377 +0000 UTC m=+1153.447424701" watchObservedRunningTime="2026-02-19 19:37:33.840442164 +0000 UTC m=+1153.452792488" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.889409 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.193164119 podStartE2EDuration="1m4.889391337s" podCreationTimestamp="2026-02-19 19:36:29 +0000 UTC" firstStartedPulling="2026-02-19 19:36:49.993832881 +0000 UTC m=+1109.606183205" lastFinishedPulling="2026-02-19 19:37:32.690060099 +0000 UTC m=+1152.302410423" observedRunningTime="2026-02-19 19:37:33.8856407 +0000 UTC m=+1153.497991034" watchObservedRunningTime="2026-02-19 19:37:33.889391337 +0000 UTC m=+1153.501741661" Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.915982 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"] Feb 19 19:37:33 crc kubenswrapper[4722]: I0219 19:37:33.922190 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-xtsln"] Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.129743 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Feb 19 19:37:34 crc kubenswrapper[4722]: 
I0219 19:37:34.361977 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.771894 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.847463 4722 generic.go:334] "Generic (PLEG): container finished" podID="248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" containerID="2e209875892b5272f7bb00341b24fa8e6b2be48cf1bccfa8acb4859e6aeca425" exitCode=0
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.847596 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqnqr" event={"ID":"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358","Type":"ContainerDied","Data":"2e209875892b5272f7bb00341b24fa8e6b2be48cf1bccfa8acb4859e6aeca425"}
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.850464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8fd9q" event={"ID":"619d59b3-6514-4648-9007-6e9ce3427c3a","Type":"ContainerStarted","Data":"f0c405b64fff456aecf84e0cb3dfbb788e3a93a4e01a434dac62f40edc004d0e"}
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.853084 4722 generic.go:334] "Generic (PLEG): container finished" podID="93536b6f-8176-4737-a547-9face2995981" containerID="687c2f6cd621666c11c3a553d69b13af20c5311d98a27db188d1d7153219352e" exitCode=0
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.853193 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c526-account-create-update-lmx4k" event={"ID":"93536b6f-8176-4737-a547-9face2995981","Type":"ContainerDied","Data":"687c2f6cd621666c11c3a553d69b13af20c5311d98a27db188d1d7153219352e"}
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.855576 4722 generic.go:334] "Generic (PLEG): container finished" podID="44afb335-8449-4492-a772-78889877810e" containerID="c2f010a6f9fb7a90aca42363ebf34cb5a6a44700de8e1351f8ac807b74981bd2" exitCode=0
Feb 19 19:37:34 crc kubenswrapper[4722]: I0219 19:37:34.855878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2439-account-create-update-lqmn5" event={"ID":"44afb335-8449-4492-a772-78889877810e","Type":"ContainerDied","Data":"c2f010a6f9fb7a90aca42363ebf34cb5a6a44700de8e1351f8ac807b74981bd2"}
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.085762 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" path="/var/lib/kubelet/pods/dfec288d-0744-48b4-8fcb-9ba349ebb6c4/volumes"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.231173 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.292985 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts\") pod \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") "
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.293213 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9glrq\" (UniqueName: \"kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq\") pod \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\" (UID: \"5bd3ad13-0324-4c1c-9b74-eb1401f06507\") "
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.293726 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5bd3ad13-0324-4c1c-9b74-eb1401f06507" (UID: "5bd3ad13-0324-4c1c-9b74-eb1401f06507"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.298960 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq" (OuterVolumeSpecName: "kube-api-access-9glrq") pod "5bd3ad13-0324-4c1c-9b74-eb1401f06507" (UID: "5bd3ad13-0324-4c1c-9b74-eb1401f06507"). InnerVolumeSpecName "kube-api-access-9glrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.394741 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9glrq\" (UniqueName: \"kubernetes.io/projected/5bd3ad13-0324-4c1c-9b74-eb1401f06507-kube-api-access-9glrq\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.394783 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd3ad13-0324-4c1c-9b74-eb1401f06507-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.868144 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4b7g9"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.868647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4b7g9" event={"ID":"5bd3ad13-0324-4c1c-9b74-eb1401f06507","Type":"ContainerDied","Data":"5b71ad4ca4512f9223138f370e3a52a25097fcb37a438f94f9b1595c0fb1c496"}
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.868694 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b71ad4ca4512f9223138f370e3a52a25097fcb37a438f94f9b1595c0fb1c496"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.868926 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2wnjc"]
Feb 19 19:37:35 crc kubenswrapper[4722]: E0219 19:37:35.869477 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="dnsmasq-dns"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.869508 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="dnsmasq-dns"
Feb 19 19:37:35 crc kubenswrapper[4722]: E0219 19:37:35.869557 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd3ad13-0324-4c1c-9b74-eb1401f06507" containerName="mariadb-database-create"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.869571 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd3ad13-0324-4c1c-9b74-eb1401f06507" containerName="mariadb-database-create"
Feb 19 19:37:35 crc kubenswrapper[4722]: E0219 19:37:35.869597 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="init"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.869614 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="init"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.869925 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd3ad13-0324-4c1c-9b74-eb1401f06507" containerName="mariadb-database-create"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.869970 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfec288d-0744-48b4-8fcb-9ba349ebb6c4" containerName="dnsmasq-dns"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.871018 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.882395 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.886965 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2wnjc"]
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.905348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2gtx\" (UniqueName: \"kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.905386 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:35 crc kubenswrapper[4722]: I0219 19:37:35.944049 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.007582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2gtx\" (UniqueName: \"kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.007952 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.010301 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.042850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2gtx\" (UniqueName: \"kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx\") pod \"root-account-create-update-2wnjc\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") " pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.197423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.247802 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.323196 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts\") pod \"93536b6f-8176-4737-a547-9face2995981\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.323341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l4ln\" (UniqueName: \"kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln\") pod \"93536b6f-8176-4737-a547-9face2995981\" (UID: \"93536b6f-8176-4737-a547-9face2995981\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.325435 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93536b6f-8176-4737-a547-9face2995981" (UID: "93536b6f-8176-4737-a547-9face2995981"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.330005 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln" (OuterVolumeSpecName: "kube-api-access-8l4ln") pod "93536b6f-8176-4737-a547-9face2995981" (UID: "93536b6f-8176-4737-a547-9face2995981"). InnerVolumeSpecName "kube-api-access-8l4ln". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.426327 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l4ln\" (UniqueName: \"kubernetes.io/projected/93536b6f-8176-4737-a547-9face2995981-kube-api-access-8l4ln\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.426357 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93536b6f-8176-4737-a547-9face2995981-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.508766 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqnqr"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.517417 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2439-account-create-update-lqmn5"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.529768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt78j\" (UniqueName: \"kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j\") pod \"44afb335-8449-4492-a772-78889877810e\" (UID: \"44afb335-8449-4492-a772-78889877810e\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.529837 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts\") pod \"44afb335-8449-4492-a772-78889877810e\" (UID: \"44afb335-8449-4492-a772-78889877810e\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.529880 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts\") pod \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.529907 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28cwg\" (UniqueName: \"kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg\") pod \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\" (UID: \"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358\") "
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.530230 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0"
Feb 19 19:37:36 crc kubenswrapper[4722]: E0219 19:37:36.530463 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 19:37:36 crc kubenswrapper[4722]: E0219 19:37:36.530481 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 19:37:36 crc kubenswrapper[4722]: E0219 19:37:36.530527 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift podName:98dc74a5-9538-49e4-9dd0-eb2735f18d41 nodeName:}" failed. No retries permitted until 2026-02-19 19:37:52.530511337 +0000 UTC m=+1172.142861661 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift") pod "swift-storage-0" (UID: "98dc74a5-9538-49e4-9dd0-eb2735f18d41") : configmap "swift-ring-files" not found
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.530815 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" (UID: "248de930-2ecc-4ca2-9b2c-e9b8ccbc6358"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.531298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44afb335-8449-4492-a772-78889877810e" (UID: "44afb335-8449-4492-a772-78889877810e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.536339 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg" (OuterVolumeSpecName: "kube-api-access-28cwg") pod "248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" (UID: "248de930-2ecc-4ca2-9b2c-e9b8ccbc6358"). InnerVolumeSpecName "kube-api-access-28cwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.542379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j" (OuterVolumeSpecName: "kube-api-access-xt78j") pod "44afb335-8449-4492-a772-78889877810e" (UID: "44afb335-8449-4492-a772-78889877810e"). InnerVolumeSpecName "kube-api-access-xt78j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.639237 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt78j\" (UniqueName: \"kubernetes.io/projected/44afb335-8449-4492-a772-78889877810e-kube-api-access-xt78j\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.639281 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44afb335-8449-4492-a772-78889877810e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.639292 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.639299 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28cwg\" (UniqueName: \"kubernetes.io/projected/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358-kube-api-access-28cwg\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.880412 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2439-account-create-update-lqmn5"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.880665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2439-account-create-update-lqmn5" event={"ID":"44afb335-8449-4492-a772-78889877810e","Type":"ContainerDied","Data":"f892930a6a254a604ca1f62b197a46b6f4adf5d1a23ff675a6f4dceae3710829"}
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.880709 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f892930a6a254a604ca1f62b197a46b6f4adf5d1a23ff675a6f4dceae3710829"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.883661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lqnqr" event={"ID":"248de930-2ecc-4ca2-9b2c-e9b8ccbc6358","Type":"ContainerDied","Data":"1eb7fc48bd8b72e0b5c8cc94587a61101731402ad9ff8b02d1b1e5d0d69c49be"}
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.883697 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1eb7fc48bd8b72e0b5c8cc94587a61101731402ad9ff8b02d1b1e5d0d69c49be"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.883769 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lqnqr"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.893073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c526-account-create-update-lmx4k" event={"ID":"93536b6f-8176-4737-a547-9face2995981","Type":"ContainerDied","Data":"d43cf646287fe537785df6cf6532f9d6502d5c80eb8ccdd82b930f04b64f53a1"}
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.893111 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d43cf646287fe537785df6cf6532f9d6502d5c80eb8ccdd82b930f04b64f53a1"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.893198 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c526-account-create-update-lmx4k"
Feb 19 19:37:36 crc kubenswrapper[4722]: I0219 19:37:36.903750 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2wnjc"]
Feb 19 19:37:36 crc kubenswrapper[4722]: W0219 19:37:36.912960 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fc542bf_bcc1_48b0_b0d9_a1c4e2702cc8.slice/crio-c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7 WatchSource:0}: Error finding container c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7: Status 404 returned error can't find the container with id c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7
Feb 19 19:37:37 crc kubenswrapper[4722]: I0219 19:37:37.903825 4722 generic.go:334] "Generic (PLEG): container finished" podID="6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" containerID="8aa3bea30fad3f939a077228a9ed1250c050038afc03ce315c796a876ab91692" exitCode=0
Feb 19 19:37:37 crc kubenswrapper[4722]: I0219 19:37:37.903887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2wnjc" event={"ID":"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8","Type":"ContainerDied","Data":"8aa3bea30fad3f939a077228a9ed1250c050038afc03ce315c796a876ab91692"}
Feb 19 19:37:37 crc kubenswrapper[4722]: I0219 19:37:37.904110 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2wnjc" event={"ID":"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8","Type":"ContainerStarted","Data":"c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7"}
Feb 19 19:37:38 crc kubenswrapper[4722]: I0219 19:37:38.496323 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6tmmr" podUID="293cde43-7bcf-4638-a080-badb26c81138" containerName="ovn-controller" probeResult="failure" output=<
Feb 19 19:37:38 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 19 19:37:38 crc kubenswrapper[4722]: >
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.321070 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.392437 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2gtx\" (UniqueName: \"kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx\") pod \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") "
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.392764 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts\") pod \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\" (UID: \"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8\") "
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.393203 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" (UID: "6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.397405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx" (OuterVolumeSpecName: "kube-api-access-x2gtx") pod "6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" (UID: "6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8"). InnerVolumeSpecName "kube-api-access-x2gtx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.496280 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2gtx\" (UniqueName: \"kubernetes.io/projected/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-kube-api-access-x2gtx\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.496623 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.920695 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2wnjc" event={"ID":"6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8","Type":"ContainerDied","Data":"c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7"}
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.920729 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1eaa478fa5fe2cdff0e2ccac60695defe60b569a03b99813333c5999f0290b7"
Feb 19 19:37:39 crc kubenswrapper[4722]: I0219 19:37:39.920778 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2wnjc"
Feb 19 19:37:40 crc kubenswrapper[4722]: I0219 19:37:40.931808 4722 generic.go:334] "Generic (PLEG): container finished" podID="c81edb08-7ac8-4cfc-abce-5895b8e7b59b" containerID="827ec543eff6496863bdbf6ae3908b628e0d5862787c9446d39fe5652d9dbfa4" exitCode=0
Feb 19 19:37:40 crc kubenswrapper[4722]: I0219 19:37:40.931887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q5fhk" event={"ID":"c81edb08-7ac8-4cfc-abce-5895b8e7b59b","Type":"ContainerDied","Data":"827ec543eff6496863bdbf6ae3908b628e0d5862787c9446d39fe5652d9dbfa4"}
Feb 19 19:37:41 crc kubenswrapper[4722]: I0219 19:37:41.954040 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a3fc19f1-6f9f-4f35-a391-1f6743480bd3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 19 19:37:43 crc kubenswrapper[4722]: I0219 19:37:43.500248 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6tmmr" podUID="293cde43-7bcf-4638-a080-badb26c81138" containerName="ovn-controller" probeResult="failure" output=<
Feb 19 19:37:43 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 19 19:37:43 crc kubenswrapper[4722]: >
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.130354 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480016 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-67cbt"]
Feb 19 19:37:44 crc kubenswrapper[4722]: E0219 19:37:44.480417 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" containerName="mariadb-database-create"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480440 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" containerName="mariadb-database-create"
Feb 19 19:37:44 crc kubenswrapper[4722]: E0219 19:37:44.480468 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93536b6f-8176-4737-a547-9face2995981" containerName="mariadb-account-create-update"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480478 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="93536b6f-8176-4737-a547-9face2995981" containerName="mariadb-account-create-update"
Feb 19 19:37:44 crc kubenswrapper[4722]: E0219 19:37:44.480488 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" containerName="mariadb-account-create-update"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480494 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" containerName="mariadb-account-create-update"
Feb 19 19:37:44 crc kubenswrapper[4722]: E0219 19:37:44.480514 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44afb335-8449-4492-a772-78889877810e" containerName="mariadb-account-create-update"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480521 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="44afb335-8449-4492-a772-78889877810e" containerName="mariadb-account-create-update"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480667 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" containerName="mariadb-account-create-update"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480684 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="93536b6f-8176-4737-a547-9face2995981" containerName="mariadb-account-create-update"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480698 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" containerName="mariadb-database-create"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.480708 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="44afb335-8449-4492-a772-78889877810e" containerName="mariadb-account-create-update"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.482789 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-67cbt"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.491221 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-67cbt"]
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.580556 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-nqj2r"]
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.581745 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-nqj2r"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.591192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcg74\" (UniqueName: \"kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.591236 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.596879 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-eefc-account-create-update-h8n6c"]
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.598445 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eefc-account-create-update-h8n6c"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.605294 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.607957 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-nqj2r"]
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.646218 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-eefc-account-create-update-h8n6c"]
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692543 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tctdk\" (UniqueName: \"kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcg74\" (UniqueName: \"kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692763 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dqjg\" (UniqueName: \"kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.692810 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.693929 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.695818 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-89e2-account-create-update-7656w"]
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.697213 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-89e2-account-create-update-7656w"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.700879 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.712335 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-89e2-account-create-update-7656w"]
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.730970 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcg74\" (UniqueName: \"kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74\") pod \"cinder-db-create-67cbt\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " pod="openstack/cinder-db-create-67cbt"
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.767489 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ws9fr"]
Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.768885 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.772879 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.773008 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qhj8b" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.773186 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.773292 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.781714 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ws9fr"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.793870 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tctdk\" (UniqueName: \"kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.793918 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.793944 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts\") pod 
\"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.794053 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dqjg\" (UniqueName: \"kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.794103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5b6\" (UniqueName: \"kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6\") pod \"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.794133 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.794925 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.795034 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.809656 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.815023 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dqjg\" (UniqueName: \"kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg\") pod \"cinder-eefc-account-create-update-h8n6c\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.815446 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tctdk\" (UniqueName: \"kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk\") pod \"cloudkitty-db-create-nqj2r\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.871946 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-j7hfg"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.872988 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.887898 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j7hfg"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.901469 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5b6\" (UniqueName: \"kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6\") pod \"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.901986 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.902161 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.902333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zlfm\" (UniqueName: \"kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.902388 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts\") pod \"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.903319 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts\") pod \"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.906204 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.917722 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.923886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb5b6\" (UniqueName: \"kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6\") pod \"cloudkitty-89e2-account-create-update-7656w\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.991043 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-7kcsc"] Feb 19 19:37:44 crc kubenswrapper[4722]: I0219 19:37:44.994937 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.004114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.004238 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zlfm\" (UniqueName: \"kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.004332 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.004352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tks5b\" (UniqueName: \"kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.004403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " 
pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.012843 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.013716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.013864 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.019239 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-36cd-account-create-update-r5498"] Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.020420 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.023523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.028905 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7kcsc"] Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.032693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zlfm\" (UniqueName: \"kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm\") pod \"keystone-db-sync-ws9fr\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.042680 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-36cd-account-create-update-r5498"] Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.096755 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.102017 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0f99-account-create-update-fflhf"] Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.103315 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.105489 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.105636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.105721 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tks5b\" (UniqueName: \"kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.105854 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.105979 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6ndf\" (UniqueName: \"kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.107240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.116532 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0f99-account-create-update-fflhf"] Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.129949 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tks5b\" (UniqueName: \"kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b\") pod \"barbican-db-create-j7hfg\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.202541 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208323 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6rpc\" (UniqueName: \"kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208429 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnvz2\" (UniqueName: \"kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6ndf\" (UniqueName: \"kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.208613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.209215 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.226278 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r6ndf\" (UniqueName: \"kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf\") pod \"neutron-db-create-7kcsc\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.310676 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.310801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.310844 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6rpc\" (UniqueName: \"kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.310884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnvz2\" (UniqueName: \"kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 
crc kubenswrapper[4722]: I0219 19:37:45.311992 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.312651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.326530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnvz2\" (UniqueName: \"kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2\") pod \"neutron-36cd-account-create-update-r5498\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.329022 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6rpc\" (UniqueName: \"kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc\") pod \"barbican-0f99-account-create-update-fflhf\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.363166 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.375078 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.421815 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.942923 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.946270 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:45 crc kubenswrapper[4722]: I0219 19:37:45.979941 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:46 crc kubenswrapper[4722]: I0219 19:37:46.942944 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.019962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-q5fhk" event={"ID":"c81edb08-7ac8-4cfc-abce-5895b8e7b59b","Type":"ContainerDied","Data":"54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924"} Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.020003 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-q5fhk" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.020021 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54695cf999be7e298e8d5f33dab8be8887de88cf6efd1f9abc7e57d8db760924" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.048839 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049020 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnhgw\" (UniqueName: \"kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049092 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049201 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle\") pod 
\"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049253 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.049344 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts\") pod \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\" (UID: \"c81edb08-7ac8-4cfc-abce-5895b8e7b59b\") " Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.050128 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.050488 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.056586 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw" (OuterVolumeSpecName: "kube-api-access-hnhgw") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "kube-api-access-hnhgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.063285 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.106322 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts" (OuterVolumeSpecName: "scripts") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.122903 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.145653 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c81edb08-7ac8-4cfc-abce-5895b8e7b59b" (UID: "c81edb08-7ac8-4cfc-abce-5895b8e7b59b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154305 4722 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154343 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154352 4722 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154362 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnhgw\" (UniqueName: \"kubernetes.io/projected/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-kube-api-access-hnhgw\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154372 4722 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154382 4722 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.154400 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c81edb08-7ac8-4cfc-abce-5895b8e7b59b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:47 crc kubenswrapper[4722]: E0219 19:37:47.558269 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc81edb08_7ac8_4cfc_abce_5895b8e7b59b.slice\": RecentStats: unable to find data in memory cache]" Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.741292 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j7hfg"] Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.753234 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-nqj2r"] Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.800668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-36cd-account-create-update-r5498"] Feb 19 19:37:47 crc kubenswrapper[4722]: I0219 19:37:47.814113 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ws9fr"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.030336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ws9fr" event={"ID":"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb","Type":"ContainerStarted","Data":"bab4e0dcd47bed11b26a97a238fcb572193e857fe8e5670dfa59d566460783b1"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.031813 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-nqj2r" 
event={"ID":"2039a569-0bc4-49a4-9e82-08964729dc7b","Type":"ContainerStarted","Data":"ddb368090d84549e8613e6a8bf09662a248b3cdae6696eb51f4b8a9270abb3bd"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.033440 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-36cd-account-create-update-r5498" event={"ID":"a8d81d51-f4b7-4dec-9548-982de19b4742","Type":"ContainerStarted","Data":"edc9e09b1a3536dad44773c79d12736ed4976a1f27a0aa500ba378c707d81315"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.034937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8fd9q" event={"ID":"619d59b3-6514-4648-9007-6e9ce3427c3a","Type":"ContainerStarted","Data":"6b62751b62c97e1ba880132d8b9f91b0968a628ff8eb98b71cf2b1fff30986bd"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.038499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7hfg" event={"ID":"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92","Type":"ContainerStarted","Data":"42d0c57b026c599554638595f7678853fcba7c141ed4152a46e9c34dcadec9ce"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.038546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7hfg" event={"ID":"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92","Type":"ContainerStarted","Data":"51c1d600bd28ddd30eb97781aa9e73764604da69509764e840b9fa1cd8a6698a"} Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.054662 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-8fd9q" podStartSLOduration=2.7591183790000002 podStartE2EDuration="16.05463495s" podCreationTimestamp="2026-02-19 19:37:32 +0000 UTC" firstStartedPulling="2026-02-19 19:37:33.835691286 +0000 UTC m=+1153.448041610" lastFinishedPulling="2026-02-19 19:37:47.131207867 +0000 UTC m=+1166.743558181" observedRunningTime="2026-02-19 19:37:48.045777295 +0000 UTC m=+1167.658127619" watchObservedRunningTime="2026-02-19 
19:37:48.05463495 +0000 UTC m=+1167.666985274" Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.065604 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-j7hfg" podStartSLOduration=4.065583101 podStartE2EDuration="4.065583101s" podCreationTimestamp="2026-02-19 19:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:37:48.0620354 +0000 UTC m=+1167.674385724" watchObservedRunningTime="2026-02-19 19:37:48.065583101 +0000 UTC m=+1167.677933425" Feb 19 19:37:48 crc kubenswrapper[4722]: W0219 19:37:48.112082 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe445148_46c0_4e8c_844a_51a5ce323370.slice/crio-8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b WatchSource:0}: Error finding container 8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b: Status 404 returned error can't find the container with id 8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b Feb 19 19:37:48 crc kubenswrapper[4722]: W0219 19:37:48.112640 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc78d063e_7cd7_4b41_b148_1a7f9a3f9914.slice/crio-fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd WatchSource:0}: Error finding container fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd: Status 404 returned error can't find the container with id fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.117652 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-eefc-account-create-update-h8n6c"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.128257 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-create-67cbt"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.140842 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-89e2-account-create-update-7656w"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.153032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-7kcsc"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.168310 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0f99-account-create-update-fflhf"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.448037 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.448750 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="prometheus" containerID="cri-o://572f93c668d26d7ec11607aad487fa047b3c482800703fea034a7c2c7174262f" gracePeriod=600 Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.449213 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="thanos-sidecar" containerID="cri-o://efe902350ad0886c73508abcb086ad6fc6e169270b01937b8957c668bd35bc1d" gracePeriod=600 Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.449283 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="config-reloader" containerID="cri-o://c2baa075267fa149454aabf4b426a4fea2dd3c3a6aa19421e7bc91c894e1e821" gracePeriod=600 Feb 19 19:37:48 crc kubenswrapper[4722]: I0219 19:37:48.523525 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6tmmr" 
podUID="293cde43-7bcf-4638-a080-badb26c81138" containerName="ovn-controller" probeResult="failure" output=< Feb 19 19:37:48 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 19:37:48 crc kubenswrapper[4722]: > Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.055287 4722 generic.go:334] "Generic (PLEG): container finished" podID="2039a569-0bc4-49a4-9e82-08964729dc7b" containerID="a4f4b237835194ac1fcedd350c7532fc74f42e672c498f5c9cea05272f6986a0" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.055481 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-nqj2r" event={"ID":"2039a569-0bc4-49a4-9e82-08964729dc7b","Type":"ContainerDied","Data":"a4f4b237835194ac1fcedd350c7532fc74f42e672c498f5c9cea05272f6986a0"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.059364 4722 generic.go:334] "Generic (PLEG): container finished" podID="a8d81d51-f4b7-4dec-9548-982de19b4742" containerID="bb275fbcbbe35a94955e26075778ab6128134f99af8b8d18b788e7b11aac61c6" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.059541 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-36cd-account-create-update-r5498" event={"ID":"a8d81d51-f4b7-4dec-9548-982de19b4742","Type":"ContainerDied","Data":"bb275fbcbbe35a94955e26075778ab6128134f99af8b8d18b788e7b11aac61c6"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.064674 4722 generic.go:334] "Generic (PLEG): container finished" podID="44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" containerID="42d0c57b026c599554638595f7678853fcba7c141ed4152a46e9c34dcadec9ce" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.064775 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7hfg" event={"ID":"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92","Type":"ContainerDied","Data":"42d0c57b026c599554638595f7678853fcba7c141ed4152a46e9c34dcadec9ce"} Feb 19 
19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.067835 4722 generic.go:334] "Generic (PLEG): container finished" podID="25905c52-4074-40d4-826f-ef89353eeaa6" containerID="6c2e2442beaae76dbd599637b272c7eae6a58710a3bb17eed3e61507df9ea9e0" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.067915 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-89e2-account-create-update-7656w" event={"ID":"25905c52-4074-40d4-826f-ef89353eeaa6","Type":"ContainerDied","Data":"6c2e2442beaae76dbd599637b272c7eae6a58710a3bb17eed3e61507df9ea9e0"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.067938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-89e2-account-create-update-7656w" event={"ID":"25905c52-4074-40d4-826f-ef89353eeaa6","Type":"ContainerStarted","Data":"1a6bf93e45faaf4db27d8e2c20d1f7fc553e5da9aa844b25ea2e3a9760a7bce6"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.078259 4722 generic.go:334] "Generic (PLEG): container finished" podID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerID="efe902350ad0886c73508abcb086ad6fc6e169270b01937b8957c668bd35bc1d" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.078293 4722 generic.go:334] "Generic (PLEG): container finished" podID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerID="c2baa075267fa149454aabf4b426a4fea2dd3c3a6aa19421e7bc91c894e1e821" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.078308 4722 generic.go:334] "Generic (PLEG): container finished" podID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerID="572f93c668d26d7ec11607aad487fa047b3c482800703fea034a7c2c7174262f" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.100506 4722 generic.go:334] "Generic (PLEG): container finished" podID="217ea569-e058-4f21-bbb7-d2f2648375eb" containerID="6d49fd861306d1a47364956e09d02157a9618a565198ef080d63694bf02fdc31" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105338 4722 
generic.go:334] "Generic (PLEG): container finished" podID="d5778eec-eb7e-4137-85bd-761ac78b9fd7" containerID="8c73c8e1b7d4896f7ab7a5272b3c22c63e7d90ad3033ca9be834b667cd882b7f" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105768 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerDied","Data":"efe902350ad0886c73508abcb086ad6fc6e169270b01937b8957c668bd35bc1d"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerDied","Data":"c2baa075267fa149454aabf4b426a4fea2dd3c3a6aa19421e7bc91c894e1e821"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105884 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerDied","Data":"572f93c668d26d7ec11607aad487fa047b3c482800703fea034a7c2c7174262f"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105895 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0f99-account-create-update-fflhf" event={"ID":"217ea569-e058-4f21-bbb7-d2f2648375eb","Type":"ContainerDied","Data":"6d49fd861306d1a47364956e09d02157a9618a565198ef080d63694bf02fdc31"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0f99-account-create-update-fflhf" event={"ID":"217ea569-e058-4f21-bbb7-d2f2648375eb","Type":"ContainerStarted","Data":"7430b538256ae44d6944b8b2a907a2e6c7bd0b62c3823fced29b96d85a5d5b4f"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105918 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7kcsc" 
event={"ID":"d5778eec-eb7e-4137-85bd-761ac78b9fd7","Type":"ContainerDied","Data":"8c73c8e1b7d4896f7ab7a5272b3c22c63e7d90ad3033ca9be834b667cd882b7f"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.105930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7kcsc" event={"ID":"d5778eec-eb7e-4137-85bd-761ac78b9fd7","Type":"ContainerStarted","Data":"414b6e4e860136d1c415be1caf745a8eda79544b985daf7b493e0eacf866bbda"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.112646 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe445148-46c0-4e8c-844a-51a5ce323370" containerID="bf1fddeb0ef2831ba2e02a1aa709a530121f690fbf768791dd2408b9c18e9009" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.112819 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-67cbt" event={"ID":"fe445148-46c0-4e8c-844a-51a5ce323370","Type":"ContainerDied","Data":"bf1fddeb0ef2831ba2e02a1aa709a530121f690fbf768791dd2408b9c18e9009"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.112856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-67cbt" event={"ID":"fe445148-46c0-4e8c-844a-51a5ce323370","Type":"ContainerStarted","Data":"8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.115121 4722 generic.go:334] "Generic (PLEG): container finished" podID="c78d063e-7cd7-4b41-b148-1a7f9a3f9914" containerID="24deafb2187b5509b9a503b5cde68eab414e437eef2f36f8141214811c39e398" exitCode=0 Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.115201 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-eefc-account-create-update-h8n6c" event={"ID":"c78d063e-7cd7-4b41-b148-1a7f9a3f9914","Type":"ContainerDied","Data":"24deafb2187b5509b9a503b5cde68eab414e437eef2f36f8141214811c39e398"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.115228 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-eefc-account-create-update-h8n6c" event={"ID":"c78d063e-7cd7-4b41-b148-1a7f9a3f9914","Type":"ContainerStarted","Data":"fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd"} Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.546549 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704340 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704399 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704448 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704585 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704637 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704663 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrjf5\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704714 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704780 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.704888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.705036 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\" (UID: \"a08df2e8-3f03-4e9c-91cf-2890026b9d76\") " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.705473 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.705693 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.707916 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.713297 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out" (OuterVolumeSpecName: "config-out") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.714073 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.718844 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config" (OuterVolumeSpecName: "config") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.719363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.731401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5" (OuterVolumeSpecName: "kube-api-access-nrjf5") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "kube-api-access-nrjf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.738953 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.779367 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config" (OuterVolumeSpecName: "web-config") pod "a08df2e8-3f03-4e9c-91cf-2890026b9d76" (UID: "a08df2e8-3f03-4e9c-91cf-2890026b9d76"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810588 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") on node \"crc\" " Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810630 4722 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config-out\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810645 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810666 4722 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-web-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810680 4722 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810690 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810703 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrjf5\" (UniqueName: 
\"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-kube-api-access-nrjf5\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810715 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/a08df2e8-3f03-4e9c-91cf-2890026b9d76-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810728 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a08df2e8-3f03-4e9c-91cf-2890026b9d76-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.810737 4722 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a08df2e8-3f03-4e9c-91cf-2890026b9d76-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.832431 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.832688 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3") on node "crc" Feb 19 19:37:49 crc kubenswrapper[4722]: I0219 19:37:49.913094 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.129498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a08df2e8-3f03-4e9c-91cf-2890026b9d76","Type":"ContainerDied","Data":"3162a6ec047952b568aaa8c73c253863e5256bf0afc61efa90f1b0efd37039e5"} Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.129597 4722 scope.go:117] "RemoveContainer" containerID="efe902350ad0886c73508abcb086ad6fc6e169270b01937b8957c668bd35bc1d" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.129791 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.198209 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.216123 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.251244 4722 scope.go:117] "RemoveContainer" containerID="c2baa075267fa149454aabf4b426a4fea2dd3c3a6aa19421e7bc91c894e1e821" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.251382 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:50 crc kubenswrapper[4722]: E0219 19:37:50.252686 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="prometheus" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252703 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="prometheus" Feb 19 19:37:50 crc kubenswrapper[4722]: E0219 19:37:50.252716 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="thanos-sidecar" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252724 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="thanos-sidecar" Feb 19 19:37:50 crc kubenswrapper[4722]: E0219 19:37:50.252736 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c81edb08-7ac8-4cfc-abce-5895b8e7b59b" containerName="swift-ring-rebalance" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252743 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81edb08-7ac8-4cfc-abce-5895b8e7b59b" containerName="swift-ring-rebalance" Feb 19 19:37:50 crc kubenswrapper[4722]: E0219 19:37:50.252757 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="config-reloader" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252763 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="config-reloader" Feb 19 19:37:50 crc kubenswrapper[4722]: E0219 19:37:50.252772 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="init-config-reloader" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252777 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="init-config-reloader" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252957 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c81edb08-7ac8-4cfc-abce-5895b8e7b59b" containerName="swift-ring-rebalance" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.252991 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="thanos-sidecar" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.253007 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="config-reloader" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.253026 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" containerName="prometheus" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.255160 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.261355 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.261558 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.261661 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.261918 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.262046 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.262163 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.270112 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.284657 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-kl9sq" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.284970 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.289629 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.345987 4722 scope.go:117] "RemoveContainer" 
containerID="572f93c668d26d7ec11607aad487fa047b3c482800703fea034a7c2c7174262f" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.347893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.347934 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.347980 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348009 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m6hm\" (UniqueName: \"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-kube-api-access-9m6hm\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348094 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348114 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: 
\"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348251 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348271 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.348313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.386449 4722 scope.go:117] "RemoveContainer" containerID="a57f0a1057a7622bf6cd5a97f7d1c754dd0d44986fc9d7f455890c4bc7caac51" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 
19:37:50.450341 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450430 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450616 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450661 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.450786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.451196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m6hm\" (UniqueName: 
\"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-kube-api-access-9m6hm\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.451223 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.451296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.451314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.451809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.455810 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.457136 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.458364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.459048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.462616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.464388 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.464430 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/991d7114ade43d3df67520db88811056b16c48c5086e58d4724863cd9821be9f/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.465508 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:50 crc kubenswrapper[4722]: I0219 19:37:50.467927 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.470794 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " 
pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.473588 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.473935 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.494065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m6hm\" (UniqueName: \"kubernetes.io/projected/e3f1f109-9754-4525-b5e8-dbf86ba52f2b-kube-api-access-9m6hm\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.575834 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd94b003-7bc7-41a0-bbad-6ff5f9869ed3\") pod \"prometheus-metric-storage-0\" (UID: \"e3f1f109-9754-4525-b5e8-dbf86ba52f2b\") " pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.587722 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.649794 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.763749 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts\") pod \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.763998 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tks5b\" (UniqueName: \"kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b\") pod \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\" (UID: \"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92\") " Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.765824 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" (UID: "44a49a3a-3b7e-4b75-aae8-ba236c1bfc92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.770308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b" (OuterVolumeSpecName: "kube-api-access-tks5b") pod "44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" (UID: "44a49a3a-3b7e-4b75-aae8-ba236c1bfc92"). InnerVolumeSpecName "kube-api-access-tks5b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.866144 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tks5b\" (UniqueName: \"kubernetes.io/projected/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-kube-api-access-tks5b\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:50.866542 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:51.091880 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a08df2e8-3f03-4e9c-91cf-2890026b9d76" path="/var/lib/kubelet/pods/a08df2e8-3f03-4e9c-91cf-2890026b9d76/volumes" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:51.146075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j7hfg" event={"ID":"44a49a3a-3b7e-4b75-aae8-ba236c1bfc92","Type":"ContainerDied","Data":"51c1d600bd28ddd30eb97781aa9e73764604da69509764e840b9fa1cd8a6698a"} Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:51.146107 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51c1d600bd28ddd30eb97781aa9e73764604da69509764e840b9fa1cd8a6698a" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:51.146107 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-j7hfg" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:51.951625 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="a3fc19f1-6f9f-4f35-a391-1f6743480bd3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:52.598969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:52.605359 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/98dc74a5-9538-49e4-9dd0-eb2735f18d41-etc-swift\") pod \"swift-storage-0\" (UID: \"98dc74a5-9538-49e4-9dd0-eb2735f18d41\") " pod="openstack/swift-storage-0" Feb 19 19:37:52 crc kubenswrapper[4722]: I0219 19:37:52.658254 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.324644 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.510994 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6tmmr" podUID="293cde43-7bcf-4638-a080-badb26c81138" containerName="ovn-controller" probeResult="failure" output=< Feb 19 19:37:53 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 19:37:53 crc kubenswrapper[4722]: > Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.535509 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.536942 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-fwvrs" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.784080 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6tmmr-config-x86ml"] Feb 19 19:37:53 crc kubenswrapper[4722]: E0219 19:37:53.784542 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" containerName="mariadb-database-create" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.784557 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" containerName="mariadb-database-create" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.784760 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" containerName="mariadb-database-create" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.785449 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.787927 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.796360 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6tmmr-config-x86ml"] Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827203 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk9qr\" (UniqueName: \"kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827604 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn\") pod 
\"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827652 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.827704 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929650 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk9qr\" (UniqueName: \"kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929758 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929853 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929887 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.929909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.930102 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.930110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.930103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.930657 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.932024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:53 crc kubenswrapper[4722]: I0219 19:37:53.949308 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk9qr\" (UniqueName: \"kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr\") pod \"ovn-controller-6tmmr-config-x86ml\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:54 crc kubenswrapper[4722]: I0219 19:37:54.109783 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:55 crc kubenswrapper[4722]: I0219 19:37:55.992134 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.019398 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.046483 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.064497 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.075451 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts\") pod \"2039a569-0bc4-49a4-9e82-08964729dc7b\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.075896 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.076059 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6ndf\" (UniqueName: \"kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf\") pod \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.076804 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2039a569-0bc4-49a4-9e82-08964729dc7b" (UID: "2039a569-0bc4-49a4-9e82-08964729dc7b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.077145 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts\") pod \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\" (UID: \"d5778eec-eb7e-4137-85bd-761ac78b9fd7\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.077229 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tctdk\" (UniqueName: \"kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk\") pod \"2039a569-0bc4-49a4-9e82-08964729dc7b\" (UID: \"2039a569-0bc4-49a4-9e82-08964729dc7b\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.077853 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2039a569-0bc4-49a4-9e82-08964729dc7b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.079100 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5778eec-eb7e-4137-85bd-761ac78b9fd7" (UID: "d5778eec-eb7e-4137-85bd-761ac78b9fd7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.090968 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk" (OuterVolumeSpecName: "kube-api-access-tctdk") pod "2039a569-0bc4-49a4-9e82-08964729dc7b" (UID: "2039a569-0bc4-49a4-9e82-08964729dc7b"). InnerVolumeSpecName "kube-api-access-tctdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.092321 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.102999 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf" (OuterVolumeSpecName: "kube-api-access-r6ndf") pod "d5778eec-eb7e-4137-85bd-761ac78b9fd7" (UID: "d5778eec-eb7e-4137-85bd-761ac78b9fd7"). InnerVolumeSpecName "kube-api-access-r6ndf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.103181 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.134773 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6tmmr-config-x86ml"] Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178661 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts\") pod \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178706 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts\") pod \"a8d81d51-f4b7-4dec-9548-982de19b4742\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178832 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dqjg\" (UniqueName: \"kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg\") pod \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\" (UID: \"c78d063e-7cd7-4b41-b148-1a7f9a3f9914\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178876 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts\") pod \"25905c52-4074-40d4-826f-ef89353eeaa6\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcg74\" (UniqueName: \"kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74\") pod 
\"fe445148-46c0-4e8c-844a-51a5ce323370\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178944 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb5b6\" (UniqueName: \"kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6\") pod \"25905c52-4074-40d4-826f-ef89353eeaa6\" (UID: \"25905c52-4074-40d4-826f-ef89353eeaa6\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.178969 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnvz2\" (UniqueName: \"kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2\") pod \"a8d81d51-f4b7-4dec-9548-982de19b4742\" (UID: \"a8d81d51-f4b7-4dec-9548-982de19b4742\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179006 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts\") pod \"217ea569-e058-4f21-bbb7-d2f2648375eb\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179036 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts\") pod \"fe445148-46c0-4e8c-844a-51a5ce323370\" (UID: \"fe445148-46c0-4e8c-844a-51a5ce323370\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179058 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6rpc\" (UniqueName: \"kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc\") pod \"217ea569-e058-4f21-bbb7-d2f2648375eb\" (UID: \"217ea569-e058-4f21-bbb7-d2f2648375eb\") " Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179579 4722 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tctdk\" (UniqueName: \"kubernetes.io/projected/2039a569-0bc4-49a4-9e82-08964729dc7b-kube-api-access-tctdk\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179602 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6ndf\" (UniqueName: \"kubernetes.io/projected/d5778eec-eb7e-4137-85bd-761ac78b9fd7-kube-api-access-r6ndf\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.179612 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5778eec-eb7e-4137-85bd-761ac78b9fd7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.181242 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c78d063e-7cd7-4b41-b148-1a7f9a3f9914" (UID: "c78d063e-7cd7-4b41-b148-1a7f9a3f9914"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.181568 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8d81d51-f4b7-4dec-9548-982de19b4742" (UID: "a8d81d51-f4b7-4dec-9548-982de19b4742"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.182194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25905c52-4074-40d4-826f-ef89353eeaa6" (UID: "25905c52-4074-40d4-826f-ef89353eeaa6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.183165 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe445148-46c0-4e8c-844a-51a5ce323370" (UID: "fe445148-46c0-4e8c-844a-51a5ce323370"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.184646 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "217ea569-e058-4f21-bbb7-d2f2648375eb" (UID: "217ea569-e058-4f21-bbb7-d2f2648375eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.185094 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg" (OuterVolumeSpecName: "kube-api-access-2dqjg") pod "c78d063e-7cd7-4b41-b148-1a7f9a3f9914" (UID: "c78d063e-7cd7-4b41-b148-1a7f9a3f9914"). InnerVolumeSpecName "kube-api-access-2dqjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.186903 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc" (OuterVolumeSpecName: "kube-api-access-k6rpc") pod "217ea569-e058-4f21-bbb7-d2f2648375eb" (UID: "217ea569-e058-4f21-bbb7-d2f2648375eb"). InnerVolumeSpecName "kube-api-access-k6rpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.189645 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6" (OuterVolumeSpecName: "kube-api-access-xb5b6") pod "25905c52-4074-40d4-826f-ef89353eeaa6" (UID: "25905c52-4074-40d4-826f-ef89353eeaa6"). InnerVolumeSpecName "kube-api-access-xb5b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.189741 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74" (OuterVolumeSpecName: "kube-api-access-jcg74") pod "fe445148-46c0-4e8c-844a-51a5ce323370" (UID: "fe445148-46c0-4e8c-844a-51a5ce323370"). InnerVolumeSpecName "kube-api-access-jcg74". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.196309 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2" (OuterVolumeSpecName: "kube-api-access-vnvz2") pod "a8d81d51-f4b7-4dec-9548-982de19b4742" (UID: "a8d81d51-f4b7-4dec-9548-982de19b4742"). InnerVolumeSpecName "kube-api-access-vnvz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.207596 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-67cbt" event={"ID":"fe445148-46c0-4e8c-844a-51a5ce323370","Type":"ContainerDied","Data":"8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.207638 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8955c30984ebf2c5ab888f047266d2a4993ef3ec32a630f0d78d6289e3b1e31b" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.207687 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-67cbt" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.214829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6tmmr-config-x86ml" event={"ID":"ff6d44c7-0792-4927-8214-a62a52211e92","Type":"ContainerStarted","Data":"d80a4243d9916b25ce2acd62cd615d65a0ec3c0a009774f96f8f4e7954f803ae"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.216717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-nqj2r" event={"ID":"2039a569-0bc4-49a4-9e82-08964729dc7b","Type":"ContainerDied","Data":"ddb368090d84549e8613e6a8bf09662a248b3cdae6696eb51f4b8a9270abb3bd"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.216757 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddb368090d84549e8613e6a8bf09662a248b3cdae6696eb51f4b8a9270abb3bd" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.216836 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-nqj2r" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.219546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-36cd-account-create-update-r5498" event={"ID":"a8d81d51-f4b7-4dec-9548-982de19b4742","Type":"ContainerDied","Data":"edc9e09b1a3536dad44773c79d12736ed4976a1f27a0aa500ba378c707d81315"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.219567 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-36cd-account-create-update-r5498" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.219578 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edc9e09b1a3536dad44773c79d12736ed4976a1f27a0aa500ba378c707d81315" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.221558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerStarted","Data":"6b0b8c11d3f84c649ee2ff2c216706cc267f271274fe182ced5421a1cadf2672"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.231665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-89e2-account-create-update-7656w" event={"ID":"25905c52-4074-40d4-826f-ef89353eeaa6","Type":"ContainerDied","Data":"1a6bf93e45faaf4db27d8e2c20d1f7fc553e5da9aa844b25ea2e3a9760a7bce6"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.231706 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a6bf93e45faaf4db27d8e2c20d1f7fc553e5da9aa844b25ea2e3a9760a7bce6" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.231771 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-89e2-account-create-update-7656w" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.237480 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.238075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-eefc-account-create-update-h8n6c" event={"ID":"c78d063e-7cd7-4b41-b148-1a7f9a3f9914","Type":"ContainerDied","Data":"fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.238106 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fab07f6602d82c70ee914ff73e24097daec979f090b8fe1419a71597a96b49cd" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.238167 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eefc-account-create-update-h8n6c" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.246140 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-7kcsc" event={"ID":"d5778eec-eb7e-4137-85bd-761ac78b9fd7","Type":"ContainerDied","Data":"414b6e4e860136d1c415be1caf745a8eda79544b985daf7b493e0eacf866bbda"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.246143 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-7kcsc" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.246190 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="414b6e4e860136d1c415be1caf745a8eda79544b985daf7b493e0eacf866bbda" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.251068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0f99-account-create-update-fflhf" event={"ID":"217ea569-e058-4f21-bbb7-d2f2648375eb","Type":"ContainerDied","Data":"7430b538256ae44d6944b8b2a907a2e6c7bd0b62c3823fced29b96d85a5d5b4f"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.251091 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7430b538256ae44d6944b8b2a907a2e6c7bd0b62c3823fced29b96d85a5d5b4f" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.251127 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0f99-account-create-update-fflhf" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.253376 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ws9fr" event={"ID":"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb","Type":"ContainerStarted","Data":"3655d044c293425ea96154111c219b4b647a3c98ed5018f1350933db2f9bafe5"} Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.275259 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ws9fr" podStartSLOduration=4.323650632 podStartE2EDuration="12.275244292s" podCreationTimestamp="2026-02-19 19:37:44 +0000 UTC" firstStartedPulling="2026-02-19 19:37:47.806475959 +0000 UTC m=+1167.418826273" lastFinishedPulling="2026-02-19 19:37:55.758069609 +0000 UTC m=+1175.370419933" observedRunningTime="2026-02-19 19:37:56.266497659 +0000 UTC m=+1175.878847983" watchObservedRunningTime="2026-02-19 19:37:56.275244292 +0000 UTC m=+1175.887594616" Feb 19 19:37:56 crc kubenswrapper[4722]: 
I0219 19:37:56.281775 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dqjg\" (UniqueName: \"kubernetes.io/projected/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-kube-api-access-2dqjg\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281806 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25905c52-4074-40d4-826f-ef89353eeaa6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281819 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcg74\" (UniqueName: \"kubernetes.io/projected/fe445148-46c0-4e8c-844a-51a5ce323370-kube-api-access-jcg74\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281831 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb5b6\" (UniqueName: \"kubernetes.io/projected/25905c52-4074-40d4-826f-ef89353eeaa6-kube-api-access-xb5b6\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281843 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnvz2\" (UniqueName: \"kubernetes.io/projected/a8d81d51-f4b7-4dec-9548-982de19b4742-kube-api-access-vnvz2\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281854 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/217ea569-e058-4f21-bbb7-d2f2648375eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281865 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe445148-46c0-4e8c-844a-51a5ce323370-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281877 4722 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-k6rpc\" (UniqueName: \"kubernetes.io/projected/217ea569-e058-4f21-bbb7-d2f2648375eb-kube-api-access-k6rpc\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281888 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c78d063e-7cd7-4b41-b148-1a7f9a3f9914-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:56 crc kubenswrapper[4722]: I0219 19:37:56.281900 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8d81d51-f4b7-4dec-9548-982de19b4742-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:57 crc kubenswrapper[4722]: I0219 19:37:57.264465 4722 generic.go:334] "Generic (PLEG): container finished" podID="619d59b3-6514-4648-9007-6e9ce3427c3a" containerID="6b62751b62c97e1ba880132d8b9f91b0968a628ff8eb98b71cf2b1fff30986bd" exitCode=0 Feb 19 19:37:57 crc kubenswrapper[4722]: I0219 19:37:57.264572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8fd9q" event={"ID":"619d59b3-6514-4648-9007-6e9ce3427c3a","Type":"ContainerDied","Data":"6b62751b62c97e1ba880132d8b9f91b0968a628ff8eb98b71cf2b1fff30986bd"} Feb 19 19:37:57 crc kubenswrapper[4722]: I0219 19:37:57.266031 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"c3f50c281e50ed54f3395a6f901a3b6701619fc9571af90792278c0bfe6cf504"} Feb 19 19:37:57 crc kubenswrapper[4722]: I0219 19:37:57.267699 4722 generic.go:334] "Generic (PLEG): container finished" podID="ff6d44c7-0792-4927-8214-a62a52211e92" containerID="39d3bd74fcad2b2ba6a5d3be195f9ef849a5a1caabbd2723eb1f1b100ba3c28c" exitCode=0 Feb 19 19:37:57 crc kubenswrapper[4722]: I0219 19:37:57.268143 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-6tmmr-config-x86ml" event={"ID":"ff6d44c7-0792-4927-8214-a62a52211e92","Type":"ContainerDied","Data":"39d3bd74fcad2b2ba6a5d3be195f9ef849a5a1caabbd2723eb1f1b100ba3c28c"} Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.282352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"35ffd6ba2011e4907a541e24b62381502edf1432e54bb29ac20e152a37e39c1e"} Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.282841 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"45cb14460651143f32ebf26d6bbc03bb8d397ca69b720b2da78b224475a8ed78"} Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.545708 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6tmmr" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.723672 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.730939 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827459 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827509 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data\") pod \"619d59b3-6514-4648-9007-6e9ce3427c3a\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827546 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data\") pod \"619d59b3-6514-4648-9007-6e9ce3427c3a\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827624 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28qs4\" (UniqueName: \"kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4\") pod \"619d59b3-6514-4648-9007-6e9ce3427c3a\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827652 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827717 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827742 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle\") pod \"619d59b3-6514-4648-9007-6e9ce3427c3a\" (UID: \"619d59b3-6514-4648-9007-6e9ce3427c3a\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827775 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk9qr\" (UniqueName: \"kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828206 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn\") pod \"ff6d44c7-0792-4927-8214-a62a52211e92\" (UID: \"ff6d44c7-0792-4927-8214-a62a52211e92\") " Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.827830 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828452 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828513 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828532 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run" (OuterVolumeSpecName: "var-run") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828619 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts" (OuterVolumeSpecName: "scripts") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828930 4722 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828957 4722 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828969 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828982 4722 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ff6d44c7-0792-4927-8214-a62a52211e92-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.828995 4722 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff6d44c7-0792-4927-8214-a62a52211e92-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.831473 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4" (OuterVolumeSpecName: "kube-api-access-28qs4") pod "619d59b3-6514-4648-9007-6e9ce3427c3a" (UID: "619d59b3-6514-4648-9007-6e9ce3427c3a"). InnerVolumeSpecName "kube-api-access-28qs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.831558 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr" (OuterVolumeSpecName: "kube-api-access-mk9qr") pod "ff6d44c7-0792-4927-8214-a62a52211e92" (UID: "ff6d44c7-0792-4927-8214-a62a52211e92"). InnerVolumeSpecName "kube-api-access-mk9qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.833902 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "619d59b3-6514-4648-9007-6e9ce3427c3a" (UID: "619d59b3-6514-4648-9007-6e9ce3427c3a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.858567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "619d59b3-6514-4648-9007-6e9ce3427c3a" (UID: "619d59b3-6514-4648-9007-6e9ce3427c3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.879713 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data" (OuterVolumeSpecName: "config-data") pod "619d59b3-6514-4648-9007-6e9ce3427c3a" (UID: "619d59b3-6514-4648-9007-6e9ce3427c3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.930866 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.930900 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.930910 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28qs4\" (UniqueName: \"kubernetes.io/projected/619d59b3-6514-4648-9007-6e9ce3427c3a-kube-api-access-28qs4\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.930921 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/619d59b3-6514-4648-9007-6e9ce3427c3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:58 crc kubenswrapper[4722]: I0219 19:37:58.930930 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk9qr\" (UniqueName: \"kubernetes.io/projected/ff6d44c7-0792-4927-8214-a62a52211e92-kube-api-access-mk9qr\") on node \"crc\" DevicePath \"\"" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.298517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6tmmr-config-x86ml" event={"ID":"ff6d44c7-0792-4927-8214-a62a52211e92","Type":"ContainerDied","Data":"d80a4243d9916b25ce2acd62cd615d65a0ec3c0a009774f96f8f4e7954f803ae"} Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.298561 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d80a4243d9916b25ce2acd62cd615d65a0ec3c0a009774f96f8f4e7954f803ae" Feb 19 19:37:59 crc kubenswrapper[4722]: 
I0219 19:37:59.298580 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6tmmr-config-x86ml" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.299929 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerStarted","Data":"f7744f8998e67a032261d2c1555245665f3e18041cfa2083a87fc83fdee4de9e"} Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.314770 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-8fd9q" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.314796 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-8fd9q" event={"ID":"619d59b3-6514-4648-9007-6e9ce3427c3a","Type":"ContainerDied","Data":"f0c405b64fff456aecf84e0cb3dfbb788e3a93a4e01a434dac62f40edc004d0e"} Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.314837 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0c405b64fff456aecf84e0cb3dfbb788e3a93a4e01a434dac62f40edc004d0e" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.319978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"0c36a68f8feba9e79e29839f692794ce920307708355ee8d3223d8832ac2ffdb"} Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.320035 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"9b22db2a298d3a844715acea22e05e515ceee761d9d6b2d67ff33b53edee69d9"} Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664307 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"] Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664756 
4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c78d063e-7cd7-4b41-b148-1a7f9a3f9914" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664784 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c78d063e-7cd7-4b41-b148-1a7f9a3f9914" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664812 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="619d59b3-6514-4648-9007-6e9ce3427c3a" containerName="glance-db-sync" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664822 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="619d59b3-6514-4648-9007-6e9ce3427c3a" containerName="glance-db-sync" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664843 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2039a569-0bc4-49a4-9e82-08964729dc7b" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664852 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2039a569-0bc4-49a4-9e82-08964729dc7b" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664870 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217ea569-e058-4f21-bbb7-d2f2648375eb" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664879 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="217ea569-e058-4f21-bbb7-d2f2648375eb" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664892 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d81d51-f4b7-4dec-9548-982de19b4742" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664901 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d81d51-f4b7-4dec-9548-982de19b4742" 
containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664911 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe445148-46c0-4e8c-844a-51a5ce323370" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664919 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe445148-46c0-4e8c-844a-51a5ce323370" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664935 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25905c52-4074-40d4-826f-ef89353eeaa6" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664943 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="25905c52-4074-40d4-826f-ef89353eeaa6" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664955 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6d44c7-0792-4927-8214-a62a52211e92" containerName="ovn-config" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664963 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6d44c7-0792-4927-8214-a62a52211e92" containerName="ovn-config" Feb 19 19:37:59 crc kubenswrapper[4722]: E0219 19:37:59.664977 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5778eec-eb7e-4137-85bd-761ac78b9fd7" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.664985 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5778eec-eb7e-4137-85bd-761ac78b9fd7" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665202 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="619d59b3-6514-4648-9007-6e9ce3427c3a" containerName="glance-db-sync" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665237 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2039a569-0bc4-49a4-9e82-08964729dc7b" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665256 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="217ea569-e058-4f21-bbb7-d2f2648375eb" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665279 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe445148-46c0-4e8c-844a-51a5ce323370" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665292 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c78d063e-7cd7-4b41-b148-1a7f9a3f9914" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665308 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="25905c52-4074-40d4-826f-ef89353eeaa6" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665317 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5778eec-eb7e-4137-85bd-761ac78b9fd7" containerName="mariadb-database-create" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665681 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d81d51-f4b7-4dec-9548-982de19b4742" containerName="mariadb-account-create-update" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.665708 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6d44c7-0792-4927-8214-a62a52211e92" containerName="ovn-config" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.673475 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.680920 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"] Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.743689 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.743841 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.743883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m6p7\" (UniqueName: \"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.743912 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.744239 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.834211 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6tmmr-config-x86ml"] Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.847464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.847535 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.847566 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m6p7\" (UniqueName: \"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.847593 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc 
kubenswrapper[4722]: I0219 19:37:59.847624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.848022 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6tmmr-config-x86ml"] Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.848508 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.848830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.848911 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.849032 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " 
pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.869695 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m6p7\" (UniqueName: \"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7\") pod \"dnsmasq-dns-5b946c75cc-lj4f2\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:37:59 crc kubenswrapper[4722]: I0219 19:37:59.987342 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:38:00 crc kubenswrapper[4722]: I0219 19:38:00.346436 4722 generic.go:334] "Generic (PLEG): container finished" podID="a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" containerID="3655d044c293425ea96154111c219b4b647a3c98ed5018f1350933db2f9bafe5" exitCode=0 Feb 19 19:38:00 crc kubenswrapper[4722]: I0219 19:38:00.346512 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ws9fr" event={"ID":"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb","Type":"ContainerDied","Data":"3655d044c293425ea96154111c219b4b647a3c98ed5018f1350933db2f9bafe5"} Feb 19 19:38:00 crc kubenswrapper[4722]: I0219 19:38:00.491256 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"] Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.085455 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6d44c7-0792-4927-8214-a62a52211e92" path="/var/lib/kubelet/pods/ff6d44c7-0792-4927-8214-a62a52211e92/volumes" Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.357300 4722 generic.go:334] "Generic (PLEG): container finished" podID="49db2196-b62a-438c-974e-750f9c414846" containerID="ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f" exitCode=0 Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.357370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" event={"ID":"49db2196-b62a-438c-974e-750f9c414846","Type":"ContainerDied","Data":"ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f"} Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.357400 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" event={"ID":"49db2196-b62a-438c-974e-750f9c414846","Type":"ContainerStarted","Data":"1730f03840824b8a57c2cfd81cc1d16b40832add768ea20e40c6c547ebe37c30"} Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.366138 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"b4c73d2acfb3bab47fe9f86d5b73dc7d2ad5ce017d727c42bd3e92ae8d48103e"} Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.366199 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"191e07edde8d8015da65cc48db35a5f8d7a3b7c28981c7886317966771d73c53"} Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.366213 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"278845d3dbe2d485391d238613e57cc6ad2cb9a1470b98c2007878ed5f3a1b7e"} Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.653694 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ws9fr" Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.683713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data\") pod \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.683888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zlfm\" (UniqueName: \"kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm\") pod \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.683932 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle\") pod \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\" (UID: \"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb\") " Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.695905 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm" (OuterVolumeSpecName: "kube-api-access-9zlfm") pod "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" (UID: "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb"). InnerVolumeSpecName "kube-api-access-9zlfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.709788 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" (UID: "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.734784 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data" (OuterVolumeSpecName: "config-data") pod "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" (UID: "a4dc7071-7951-4302-96d9-ef7e4f7f2ceb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.785933 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.785965 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zlfm\" (UniqueName: \"kubernetes.io/projected/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-kube-api-access-9zlfm\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.785976 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:01 crc kubenswrapper[4722]: I0219 19:38:01.950298 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.410136 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" event={"ID":"49db2196-b62a-438c-974e-750f9c414846","Type":"ContainerStarted","Data":"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017"} Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.410338 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:38:02 crc 
kubenswrapper[4722]: I0219 19:38:02.421442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"ddf4dc591f7c85df4c6746209688d14fe909b429746cbe7b920ee502df56cb84"}
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.427493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ws9fr" event={"ID":"a4dc7071-7951-4302-96d9-ef7e4f7f2ceb","Type":"ContainerDied","Data":"bab4e0dcd47bed11b26a97a238fcb572193e857fe8e5670dfa59d566460783b1"}
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.427537 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bab4e0dcd47bed11b26a97a238fcb572193e857fe8e5670dfa59d566460783b1"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.427617 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ws9fr"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.437501 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" podStartSLOduration=3.437482934 podStartE2EDuration="3.437482934s" podCreationTimestamp="2026-02-19 19:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:02.432639444 +0000 UTC m=+1182.044989778" watchObservedRunningTime="2026-02-19 19:38:02.437482934 +0000 UTC m=+1182.049833248"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.589098 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.609660 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ctdw7"]
Feb 19 19:38:02 crc kubenswrapper[4722]: E0219 19:38:02.610001 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" containerName="keystone-db-sync"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.610016 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" containerName="keystone-db-sync"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.613335 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" containerName="keystone-db-sync"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.614048 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.618627 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.618655 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.618655 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.618931 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qhj8b"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.619111 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.673237 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ctdw7"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.692431 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.694314 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704874 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704918 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704963 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xjh6\" (UniqueName: \"kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.704994 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.718855 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.803528 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nldcm"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.806811 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.806919 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkc7w\" (UniqueName: \"kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807002 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807048 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807140 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807207 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807245 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807270 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807288 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xjh6\" (UniqueName: \"kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.807808 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.810846 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4h658"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.817529 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.817673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.817920 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.818117 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.818397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.821527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.827289 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.849026 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nldcm"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.868479 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xjh6\" (UniqueName: \"kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6\") pod \"keystone-bootstrap-ctdw7\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909229 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdjl8\" (UniqueName: \"kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909280 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909324 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909394 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909467 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909492 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909550 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909607 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.909692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkc7w\" (UniqueName: \"kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.910982 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.911719 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.913862 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.925028 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.956404 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ctdw7"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.958374 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7b98l"]
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.959624 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.962739 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.975410 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.975594 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4wknf"
Feb 19 19:38:02 crc kubenswrapper[4722]: I0219 19:38:02.995040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkc7w\" (UniqueName: \"kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w\") pod \"dnsmasq-dns-784f69c749-bhnjb\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015510 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015550 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015623 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdjl8\" (UniqueName: \"kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.015704 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.020019 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.020206 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.024336 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-lnf5k"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.035486 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-bhnjb"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.035769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.037446 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.037691 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lnf5k"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.042240 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.042509 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tj2ww"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.060088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.092702 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdjl8\" (UniqueName: \"kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8\") pod \"cinder-db-sync-nldcm\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.117516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n4nt\" (UniqueName: \"kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.117637 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.117673 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.144211 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7b98l"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.144244 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lnf5k"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.144256 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-xdgs2"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.146317 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-xdgs2"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.146394 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-xdgs2"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.149674 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.149922 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.149885 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.150139 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-bnkq4"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.152444 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.162824 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.162965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.166972 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.169375 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.171876 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.189656 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zrwzj"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.191311 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zrwzj"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.194444 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7xcck"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.194653 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.194884 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.197317 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nldcm"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.221575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.222314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.222350 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.222395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvk66\" (UniqueName: \"kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.222481 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.222646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n4nt\" (UniqueName: \"kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.225131 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zrwzj"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.228836 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.238418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.262879 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.264414 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.267308 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n4nt\" (UniqueName: \"kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt\") pod \"neutron-db-sync-7b98l\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " pod="openstack/neutron-db-sync-7b98l"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.276019 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"]
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.327410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.327621 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.327716 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvk66\" (UniqueName: \"kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.327780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.328517 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.328657 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.329557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.329662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmgm9\" (UniqueName: \"kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.329748 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.329861 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.329948 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330165 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2"
Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330235 4722 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330302 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330349 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsfs\" (UniqueName: \"kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330377 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.330457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7zht\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.332902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.337597 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.356377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvk66\" (UniqueName: \"kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66\") pod \"barbican-db-sync-lnf5k\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433321 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsfs\" (UniqueName: 
\"kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433694 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433752 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433781 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433808 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7zht\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht\") pod \"cloudkitty-db-sync-xdgs2\" (UID: 
\"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433830 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433921 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.433963 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7qpx\" (UniqueName: \"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 
19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434031 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434053 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmgm9\" (UniqueName: \"kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.434141 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.435996 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436130 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436183 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs\") pod \"placement-db-sync-zrwzj\" (UID: 
\"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.436936 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.438511 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.446492 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.446614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.446957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.446644 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.451351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.451877 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.457607 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7zht\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 
19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.457807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.458050 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmgm9\" (UniqueName: \"kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.465129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsfs\" (UniqueName: \"kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs\") pod \"ceilometer-0\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") " pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.465958 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs\") pod \"cloudkitty-db-sync-xdgs2\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.466330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.467453 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data\") pod \"placement-db-sync-zrwzj\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.494991 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b98l" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.504575 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.521414 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"73bbded6588f4d10f80a3585928a42958ae2d126263b408adce35c4d3a24ec4b"} Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.541331 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.542293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.542333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.542363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.542440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7qpx\" (UniqueName: \"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.542519 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.543720 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.544649 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.545787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc\") pod 
\"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.549995 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.562709 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7qpx\" (UniqueName: \"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx\") pod \"dnsmasq-dns-f84976bdf-8xk8b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") " pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.570965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.673922 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.708718 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.729867 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.774979 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.777364 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.781372 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.781754 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9s8kl" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.781622 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.781709 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.792899 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.861178 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.881613 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.887050 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.887407 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.896621 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.908941 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ctdw7"] Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949137 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949263 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 
19:38:03.949295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949339 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vsfq\" (UniqueName: \"kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949400 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:03 crc kubenswrapper[4722]: I0219 19:38:03.949422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") 
" pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052780 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052839 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052868 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vsfq\" (UniqueName: \"kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052942 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052958 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.052974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053000 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053014 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8lj4\" (UniqueName: \"kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4\") pod \"glance-default-internal-api-0\" (UID: 
\"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053033 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053117 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053142 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" 
(UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.053199 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.058833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.059429 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.063058 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.071822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 
19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.079788 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.080484 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.100727 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vsfq\" (UniqueName: \"kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.148647 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.148685 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2019fecbddc337ddf53783637eb0008bc901e49a55294deb1e2d06fbb77c3ae3/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158310 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158414 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8lj4\" (UniqueName: \"kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158432 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.158460 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.164033 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run\") 
pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.164356 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.170072 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nldcm"] Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.199848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.213237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.246999 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.247074 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b323df4ccd136fd865256cd83fe693e56c32fbc8a05d96b41caf6babb703da86/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.282635 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8lj4\" (UniqueName: \"kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.282818 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.342292 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7b98l"] Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.343074 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.498508 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.525557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.540508 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b98l" event={"ID":"eab1ce59-2254-419a-bab0-cf5e87888634","Type":"ContainerStarted","Data":"faf928cc455dc359a2347459ccaaf8574498adb59570eba6f898fdc7c69b0cd6"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.543376 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"b77e3cc63155037458b7636e61f90253b9f0f19e1fb29907d523fc36aff23280"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.544032 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ctdw7" event={"ID":"09a108ba-bb88-4799-a230-638cabf304b0","Type":"ContainerStarted","Data":"9d69e23e43e8ab2aa747e1b227270b4fab24359a7aa862c4ab858b12cf3f9985"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.547809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-bhnjb" 
event={"ID":"6ec24fa4-f123-4210-83f8-915ca2a1a88e","Type":"ContainerStarted","Data":"f8fb40f06ecabee91859e66632484f9e76d88441037ec83802ca97f9f4dee4d3"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.564464 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3f1f109-9754-4525-b5e8-dbf86ba52f2b" containerID="f7744f8998e67a032261d2c1555245665f3e18041cfa2083a87fc83fdee4de9e" exitCode=0 Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.564565 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerDied","Data":"f7744f8998e67a032261d2c1555245665f3e18041cfa2083a87fc83fdee4de9e"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.565910 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="dnsmasq-dns" containerID="cri-o://1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017" gracePeriod=10 Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.566132 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nldcm" event={"ID":"512a4c5e-3ea6-42a8-9f83-8c0e5375891d","Type":"ContainerStarted","Data":"65597a01a3e59b230c7526b664301c7f8fdd9e898558a558f3adbb4bcd59ec0f"} Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.580530 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lnf5k"] Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.634856 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.724119 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.761167 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.777287 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-xdgs2"] Feb 19 19:38:04 crc kubenswrapper[4722]: I0219 19:38:04.919478 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zrwzj"] Feb 19 19:38:04 crc kubenswrapper[4722]: W0219 19:38:04.929854 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41216a8d_32f8_4ec6_ab65_5474453cad03.slice/crio-459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94 WatchSource:0}: Error finding container 459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94: Status 404 returned error can't find the container with id 459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94 Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.053287 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"] Feb 19 19:38:05 crc kubenswrapper[4722]: W0219 19:38:05.144717 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0987fde3_8329_4305_bd1c_efa7cf79306b.slice/crio-dfd27acc06f5b6599cb43200558affa79dcb057ac010f2b3d993579ba443e434 WatchSource:0}: Error finding container dfd27acc06f5b6599cb43200558affa79dcb057ac010f2b3d993579ba443e434: Status 404 returned error can't find the container with id dfd27acc06f5b6599cb43200558affa79dcb057ac010f2b3d993579ba443e434 Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.375047 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.524043 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc\") pod \"49db2196-b62a-438c-974e-750f9c414846\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.524101 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m6p7\" (UniqueName: \"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7\") pod \"49db2196-b62a-438c-974e-750f9c414846\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.524280 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb\") pod \"49db2196-b62a-438c-974e-750f9c414846\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.524332 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb\") pod \"49db2196-b62a-438c-974e-750f9c414846\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.524365 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config\") pod \"49db2196-b62a-438c-974e-750f9c414846\" (UID: \"49db2196-b62a-438c-974e-750f9c414846\") " Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.549018 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7" (OuterVolumeSpecName: "kube-api-access-7m6p7") pod "49db2196-b62a-438c-974e-750f9c414846" (UID: "49db2196-b62a-438c-974e-750f9c414846"). InnerVolumeSpecName "kube-api-access-7m6p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.592676 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerStarted","Data":"1faa29ce27320ab22dc6db2828db88d540021f7a0832148de51b439f8684b1f0"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.607575 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerStarted","Data":"0086b29404894dff5db29e861f6c264796c21706e5ca7150f1456fe9a82acdd8"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.622004 4722 generic.go:334] "Generic (PLEG): container finished" podID="49db2196-b62a-438c-974e-750f9c414846" containerID="1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017" exitCode=0 Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.622035 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.622182 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" event={"ID":"49db2196-b62a-438c-974e-750f9c414846","Type":"ContainerDied","Data":"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.622245 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-lj4f2" event={"ID":"49db2196-b62a-438c-974e-750f9c414846","Type":"ContainerDied","Data":"1730f03840824b8a57c2cfd81cc1d16b40832add768ea20e40c6c547ebe37c30"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.622263 4722 scope.go:117] "RemoveContainer" containerID="1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.624935 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "49db2196-b62a-438c-974e-750f9c414846" (UID: "49db2196-b62a-438c-974e-750f9c414846"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.629868 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.629910 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m6p7\" (UniqueName: \"kubernetes.io/projected/49db2196-b62a-438c-974e-750f9c414846-kube-api-access-7m6p7\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.657366 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.659421 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49db2196-b62a-438c-974e-750f9c414846" (UID: "49db2196-b62a-438c-974e-750f9c414846"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.671258 4722 generic.go:334] "Generic (PLEG): container finished" podID="6ec24fa4-f123-4210-83f8-915ca2a1a88e" containerID="28dac43a68d14b0866389b29dfa45b324fd9b1b009f51a0ef8654fd374d27cfe" exitCode=0 Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.671536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-bhnjb" event={"ID":"6ec24fa4-f123-4210-83f8-915ca2a1a88e","Type":"ContainerDied","Data":"28dac43a68d14b0866389b29dfa45b324fd9b1b009f51a0ef8654fd374d27cfe"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.674425 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config" (OuterVolumeSpecName: "config") pod "49db2196-b62a-438c-974e-750f9c414846" (UID: "49db2196-b62a-438c-974e-750f9c414846"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.705551 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "49db2196-b62a-438c-974e-750f9c414846" (UID: "49db2196-b62a-438c-974e-750f9c414846"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.721602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ctdw7" event={"ID":"09a108ba-bb88-4799-a230-638cabf304b0","Type":"ContainerStarted","Data":"8fb5c1c0ec360aa5fc271ce7683847ce4ebe5cbb2a0793d19d34b7cc7bc220b8"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.754096 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.754134 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.754147 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49db2196-b62a-438c-974e-750f9c414846-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.780285 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lnf5k" event={"ID":"9c2453a9-4c81-4256-b52d-edb69c12c7d7","Type":"ContainerStarted","Data":"c55c99500a8dc3a393a869149de80e388347c4c52dbc3f1981dc5cba2b917f9a"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.796649 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.808473 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.826077 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ctdw7" podStartSLOduration=3.826052082 
podStartE2EDuration="3.826052082s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:05.780111703 +0000 UTC m=+1185.392462027" watchObservedRunningTime="2026-02-19 19:38:05.826052082 +0000 UTC m=+1185.438402406" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.836611 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b98l" event={"ID":"eab1ce59-2254-419a-bab0-cf5e87888634","Type":"ContainerStarted","Data":"a1c03548ff56ab3102ffaa64e0990092747adeddc1030d3c048e1f3f59e0095b"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.875416 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.887980 4722 scope.go:117] "RemoveContainer" containerID="ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.889197 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7b98l" podStartSLOduration=3.889185107 podStartE2EDuration="3.889185107s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:05.869864215 +0000 UTC m=+1185.482214539" watchObservedRunningTime="2026-02-19 19:38:05.889185107 +0000 UTC m=+1185.501535431" Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.942839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"d44927022c5675b593b112d3c323999589b2f15cd28af0dc8b1a34c98596e11d"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.943193 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"406e254384848462bf0f562451ea2560c26093d401090dc9aacb9821506ef209"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.953403 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.956207 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-xdgs2" event={"ID":"fb399ce1-7269-4d99-9140-0d1d33a6fd6a","Type":"ContainerStarted","Data":"b485d2ccfdc9766193d0fa763ea0b9af82b812effcaae62a566a8b1ce25316b5"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.972411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerStarted","Data":"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.972452 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerStarted","Data":"dfd27acc06f5b6599cb43200558affa79dcb057ac010f2b3d993579ba443e434"} Feb 19 19:38:05 crc kubenswrapper[4722]: I0219 19:38:05.975188 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zrwzj" event={"ID":"41216a8d-32f8-4ec6-ab65-5474453cad03","Type":"ContainerStarted","Data":"459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94"} Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.008942 4722 scope.go:117] "RemoveContainer" containerID="1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017" Feb 19 19:38:06 crc kubenswrapper[4722]: E0219 19:38:06.021688 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017\": container with ID starting with 1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017 not found: ID does not exist" containerID="1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.021734 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017"} err="failed to get container status \"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017\": rpc error: code = NotFound desc = could not find container \"1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017\": container with ID starting with 1afa2ad9509eb9233f6766563ef1bbb652b570a58ee830bfeb7c59c81c109017 not found: ID does not exist" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.021759 4722 scope.go:117] "RemoveContainer" containerID="ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f" Feb 19 19:38:06 crc kubenswrapper[4722]: E0219 19:38:06.039439 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f\": container with ID starting with ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f not found: ID does not exist" containerID="ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.039481 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f"} err="failed to get container status \"ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f\": rpc error: code = NotFound desc = could not find container \"ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f\": container with ID 
starting with ed1cc93974143b7843bcf72945b7788fa7890dcc76f5d781e411251a5d3f109f not found: ID does not exist" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.202234 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"] Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.219053 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-lj4f2"] Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.422656 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-bhnjb" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.497968 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc\") pod \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.498071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb\") pod \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.498107 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb\") pod \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.498313 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config\") pod \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\" (UID: 
\"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.498423 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkc7w\" (UniqueName: \"kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w\") pod \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\" (UID: \"6ec24fa4-f123-4210-83f8-915ca2a1a88e\") " Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.516324 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w" (OuterVolumeSpecName: "kube-api-access-tkc7w") pod "6ec24fa4-f123-4210-83f8-915ca2a1a88e" (UID: "6ec24fa4-f123-4210-83f8-915ca2a1a88e"). InnerVolumeSpecName "kube-api-access-tkc7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.551941 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ec24fa4-f123-4210-83f8-915ca2a1a88e" (UID: "6ec24fa4-f123-4210-83f8-915ca2a1a88e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.568682 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ec24fa4-f123-4210-83f8-915ca2a1a88e" (UID: "6ec24fa4-f123-4210-83f8-915ca2a1a88e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.579053 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ec24fa4-f123-4210-83f8-915ca2a1a88e" (UID: "6ec24fa4-f123-4210-83f8-915ca2a1a88e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.580209 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config" (OuterVolumeSpecName: "config") pod "6ec24fa4-f123-4210-83f8-915ca2a1a88e" (UID: "6ec24fa4-f123-4210-83f8-915ca2a1a88e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.603417 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkc7w\" (UniqueName: \"kubernetes.io/projected/6ec24fa4-f123-4210-83f8-915ca2a1a88e-kube-api-access-tkc7w\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.603448 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.603459 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.603472 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 
19:38:06 crc kubenswrapper[4722]: I0219 19:38:06.603485 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec24fa4-f123-4210-83f8-915ca2a1a88e-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.034977 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"dbaeb2907a7d4f1a8075471a7f624d26c20a73faf94a2ad21e1504b734de3c4e"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.035039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"8a1bc9bf3676a2530619b43c1e910cbeb74b46fe2d4d77c2a0f01940d7d90b78"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.035056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"98dc74a5-9538-49e4-9dd0-eb2735f18d41","Type":"ContainerStarted","Data":"9b92ed5967063c1aee12781aad0355a1be7a5579b64ae61784a503f110be9780"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.038049 4722 generic.go:334] "Generic (PLEG): container finished" podID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerID="981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9" exitCode=0 Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.038101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerDied","Data":"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.038126 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" 
event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerStarted","Data":"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.038923 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.068274 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784f69c749-bhnjb" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.068252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784f69c749-bhnjb" event={"ID":"6ec24fa4-f123-4210-83f8-915ca2a1a88e","Type":"ContainerDied","Data":"f8fb40f06ecabee91859e66632484f9e76d88441037ec83802ca97f9f4dee4d3"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.069273 4722 scope.go:117] "RemoveContainer" containerID="28dac43a68d14b0866389b29dfa45b324fd9b1b009f51a0ef8654fd374d27cfe" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.101389 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=41.491530975 podStartE2EDuration="48.101362235s" podCreationTimestamp="2026-02-19 19:37:19 +0000 UTC" firstStartedPulling="2026-02-19 19:37:56.251030728 +0000 UTC m=+1175.863381052" lastFinishedPulling="2026-02-19 19:38:02.860861988 +0000 UTC m=+1182.473212312" observedRunningTime="2026-02-19 19:38:07.079190154 +0000 UTC m=+1186.691540488" watchObservedRunningTime="2026-02-19 19:38:07.101362235 +0000 UTC m=+1186.713712569" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.126023 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" podStartSLOduration=4.125999862 podStartE2EDuration="4.125999862s" podCreationTimestamp="2026-02-19 19:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:07.121376717 +0000 UTC m=+1186.733727041" watchObservedRunningTime="2026-02-19 19:38:07.125999862 +0000 UTC m=+1186.738350186" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.176087 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49db2196-b62a-438c-974e-750f9c414846" path="/var/lib/kubelet/pods/49db2196-b62a-438c-974e-750f9c414846/volumes" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.177073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerStarted","Data":"74d2769ab4752d1feeca0ef2edcd424d998dd7f01e76629b5fdbd1920be6013a"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.177108 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerStarted","Data":"f05ea5f4636b64bd38579945e16464ca01ab6cde2bcc3d0ac468f593dd5c2f4e"} Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.221897 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"] Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.245912 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784f69c749-bhnjb"] Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.395611 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"] Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.435191 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:07 crc kubenswrapper[4722]: E0219 19:38:07.435682 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="dnsmasq-dns" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 
19:38:07.435704 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="dnsmasq-dns" Feb 19 19:38:07 crc kubenswrapper[4722]: E0219 19:38:07.435716 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec24fa4-f123-4210-83f8-915ca2a1a88e" containerName="init" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.435724 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec24fa4-f123-4210-83f8-915ca2a1a88e" containerName="init" Feb 19 19:38:07 crc kubenswrapper[4722]: E0219 19:38:07.435748 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="init" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.435756 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="init" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.435970 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec24fa4-f123-4210-83f8-915ca2a1a88e" containerName="init" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.436009 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="49db2196-b62a-438c-974e-750f9c414846" containerName="dnsmasq-dns" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.437439 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.444837 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.449735 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529429 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529540 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65hgp\" (UniqueName: \"kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529669 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.529716 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65hgp\" (UniqueName: \"kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632445 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.632553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.633674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.633943 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: 
I0219 19:38:07.634185 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.635189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.635902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.649939 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65hgp\" (UniqueName: \"kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp\") pod \"dnsmasq-dns-785d8bcb8c-587r4\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:07 crc kubenswrapper[4722]: I0219 19:38:07.823287 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:08 crc kubenswrapper[4722]: I0219 19:38:08.148042 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerStarted","Data":"26f23b94ceca02366d6ad7b5b51d95589832118420b7f024d6cc30a861e72a4d"} Feb 19 19:38:08 crc kubenswrapper[4722]: I0219 19:38:08.153489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerStarted","Data":"0cb57e5ce54d4ebdcfc5834077ee30754bec175aed42d1c77310f409f5adb33c"} Feb 19 19:38:08 crc kubenswrapper[4722]: I0219 19:38:08.163967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerStarted","Data":"8dc16b1e074fd3091b107bc3bda1e24e49c55a05b0a2a77d6492836d8b81cf1e"} Feb 19 19:38:08 crc kubenswrapper[4722]: I0219 19:38:08.350274 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:08 crc kubenswrapper[4722]: W0219 19:38:08.396940 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0396bebf_310d_43c3_a0f5_e8cddf9c3cb0.slice/crio-ccc22dda92d98641022846e698ef973d9b21c55e7af354f095475756126633bf WatchSource:0}: Error finding container ccc22dda92d98641022846e698ef973d9b21c55e7af354f095475756126633bf: Status 404 returned error can't find the container with id ccc22dda92d98641022846e698ef973d9b21c55e7af354f095475756126633bf Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.146526 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec24fa4-f123-4210-83f8-915ca2a1a88e" path="/var/lib/kubelet/pods/6ec24fa4-f123-4210-83f8-915ca2a1a88e/volumes" Feb 19 19:38:09 crc 
kubenswrapper[4722]: I0219 19:38:09.192431 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerStarted","Data":"8abd067186838cbd1efbd6d007696dcd996ec432757392f167f24e47f4f57171"} Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.192634 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-log" containerID="cri-o://26f23b94ceca02366d6ad7b5b51d95589832118420b7f024d6cc30a861e72a4d" gracePeriod=30 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.192721 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-httpd" containerID="cri-o://8abd067186838cbd1efbd6d007696dcd996ec432757392f167f24e47f4f57171" gracePeriod=30 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.208314 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerStarted","Data":"17e885ee19d45823afa31ec6273541ee2f4327ad3250b341ab5883d6c0baed3b"} Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.208451 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-log" containerID="cri-o://0cb57e5ce54d4ebdcfc5834077ee30754bec175aed42d1c77310f409f5adb33c" gracePeriod=30 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.208648 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-httpd" 
containerID="cri-o://17e885ee19d45823afa31ec6273541ee2f4327ad3250b341ab5883d6c0baed3b" gracePeriod=30 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.229654 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.229611387 podStartE2EDuration="7.229611387s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:09.216777308 +0000 UTC m=+1188.829127632" watchObservedRunningTime="2026-02-19 19:38:09.229611387 +0000 UTC m=+1188.841961711" Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.250060 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.250033482 podStartE2EDuration="7.250033482s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:09.242060854 +0000 UTC m=+1188.854411178" watchObservedRunningTime="2026-02-19 19:38:09.250033482 +0000 UTC m=+1188.862383806" Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.262934 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e3f1f109-9754-4525-b5e8-dbf86ba52f2b","Type":"ContainerStarted","Data":"3745c2c1fca541ddbd93814a2b3c6f93a82b174021d51f8029526bbe280b334b"} Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.266851 4722 generic.go:334] "Generic (PLEG): container finished" podID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerID="e6f983685272a18f6384c33405c42fd7cac9d9c7919a092034ad166f31cb8a76" exitCode=0 Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.267104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" 
event={"ID":"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0","Type":"ContainerDied","Data":"e6f983685272a18f6384c33405c42fd7cac9d9c7919a092034ad166f31cb8a76"}
Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.267907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" event={"ID":"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0","Type":"ContainerStarted","Data":"ccc22dda92d98641022846e698ef973d9b21c55e7af354f095475756126633bf"}
Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.268373 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="dnsmasq-dns" containerID="cri-o://abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba" gracePeriod=10
Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.303271 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.303143715 podStartE2EDuration="19.303143715s" podCreationTimestamp="2026-02-19 19:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:09.295824168 +0000 UTC m=+1188.908174512" watchObservedRunningTime="2026-02-19 19:38:09.303143715 +0000 UTC m=+1188.915494039"
Feb 19 19:38:09 crc kubenswrapper[4722]: I0219 19:38:09.820202 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.006218 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7qpx\" (UniqueName: \"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx\") pod \"0987fde3-8329-4305-bd1c-efa7cf79306b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.006298 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc\") pod \"0987fde3-8329-4305-bd1c-efa7cf79306b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.006372 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb\") pod \"0987fde3-8329-4305-bd1c-efa7cf79306b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.006446 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb\") pod \"0987fde3-8329-4305-bd1c-efa7cf79306b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.006467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config\") pod \"0987fde3-8329-4305-bd1c-efa7cf79306b\" (UID: \"0987fde3-8329-4305-bd1c-efa7cf79306b\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.016945 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx" (OuterVolumeSpecName: "kube-api-access-t7qpx") pod "0987fde3-8329-4305-bd1c-efa7cf79306b" (UID: "0987fde3-8329-4305-bd1c-efa7cf79306b"). InnerVolumeSpecName "kube-api-access-t7qpx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.070796 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0987fde3-8329-4305-bd1c-efa7cf79306b" (UID: "0987fde3-8329-4305-bd1c-efa7cf79306b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.089294 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0987fde3-8329-4305-bd1c-efa7cf79306b" (UID: "0987fde3-8329-4305-bd1c-efa7cf79306b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.101715 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config" (OuterVolumeSpecName: "config") pod "0987fde3-8329-4305-bd1c-efa7cf79306b" (UID: "0987fde3-8329-4305-bd1c-efa7cf79306b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.109142 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7qpx\" (UniqueName: \"kubernetes.io/projected/0987fde3-8329-4305-bd1c-efa7cf79306b-kube-api-access-t7qpx\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.109209 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.109219 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.109228 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.135656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0987fde3-8329-4305-bd1c-efa7cf79306b" (UID: "0987fde3-8329-4305-bd1c-efa7cf79306b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.211264 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0987fde3-8329-4305-bd1c-efa7cf79306b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.287845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" event={"ID":"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0","Type":"ContainerStarted","Data":"ad4c618017bbd4becca3e0b0113a9facbc19857b41b1b6a0185965e3fe42e985"}
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.288042 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297238 4722 generic.go:334] "Generic (PLEG): container finished" podID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerID="8abd067186838cbd1efbd6d007696dcd996ec432757392f167f24e47f4f57171" exitCode=0
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297278 4722 generic.go:334] "Generic (PLEG): container finished" podID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerID="26f23b94ceca02366d6ad7b5b51d95589832118420b7f024d6cc30a861e72a4d" exitCode=143
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerDied","Data":"8abd067186838cbd1efbd6d007696dcd996ec432757392f167f24e47f4f57171"}
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297397 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerDied","Data":"26f23b94ceca02366d6ad7b5b51d95589832118420b7f024d6cc30a861e72a4d"}
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5e004a3-da53-4fbb-a396-52e33d205e2e","Type":"ContainerDied","Data":"74d2769ab4752d1feeca0ef2edcd424d998dd7f01e76629b5fdbd1920be6013a"}
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.297419 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74d2769ab4752d1feeca0ef2edcd424d998dd7f01e76629b5fdbd1920be6013a"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.300754 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303826 4722 generic.go:334] "Generic (PLEG): container finished" podID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerID="17e885ee19d45823afa31ec6273541ee2f4327ad3250b341ab5883d6c0baed3b" exitCode=0
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303856 4722 generic.go:334] "Generic (PLEG): container finished" podID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerID="0cb57e5ce54d4ebdcfc5834077ee30754bec175aed42d1c77310f409f5adb33c" exitCode=143
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerDied","Data":"17e885ee19d45823afa31ec6273541ee2f4327ad3250b341ab5883d6c0baed3b"}
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerDied","Data":"0cb57e5ce54d4ebdcfc5834077ee30754bec175aed42d1c77310f409f5adb33c"}
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e","Type":"ContainerDied","Data":"f05ea5f4636b64bd38579945e16464ca01ab6cde2bcc3d0ac468f593dd5c2f4e"}
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.303963 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f05ea5f4636b64bd38579945e16464ca01ab6cde2bcc3d0ac468f593dd5c2f4e"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.307423 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" podStartSLOduration=3.307411243 podStartE2EDuration="3.307411243s" podCreationTimestamp="2026-02-19 19:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:10.304712539 +0000 UTC m=+1189.917062863" watchObservedRunningTime="2026-02-19 19:38:10.307411243 +0000 UTC m=+1189.919761577"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.317688 4722 generic.go:334] "Generic (PLEG): container finished" podID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerID="abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba" exitCode=0
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.317741 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.317830 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerDied","Data":"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba"}
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.317860 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f84976bdf-8xk8b" event={"ID":"0987fde3-8329-4305-bd1c-efa7cf79306b","Type":"ContainerDied","Data":"dfd27acc06f5b6599cb43200558affa79dcb057ac010f2b3d993579ba443e434"}
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.317881 4722 scope.go:117] "RemoveContainer" containerID="abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.324253 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.384448 4722 scope.go:117] "RemoveContainer" containerID="981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.406124 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"]
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.415835 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.415905 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.415928 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.415983 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416240 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416422 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416519 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416543 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416582 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vsfq\" (UniqueName: \"kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416685 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416797 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416827 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8lj4\" (UniqueName: \"kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4\") pod \"b5e004a3-da53-4fbb-a396-52e33d205e2e\" (UID: \"b5e004a3-da53-4fbb-a396-52e33d205e2e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.416869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run\") pod \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\" (UID: \"95c1d6d9-8593-4f8b-a944-5fbbdd277e1e\") "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.419320 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.419637 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs" (OuterVolumeSpecName: "logs") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.419898 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs" (OuterVolumeSpecName: "logs") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.420661 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.421971 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f84976bdf-8xk8b"]
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.422866 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts" (OuterVolumeSpecName: "scripts") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.423144 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq" (OuterVolumeSpecName: "kube-api-access-7vsfq") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "kube-api-access-7vsfq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.426051 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts" (OuterVolumeSpecName: "scripts") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.426482 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4" (OuterVolumeSpecName: "kube-api-access-l8lj4") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "kube-api-access-l8lj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.438626 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5" (OuterVolumeSpecName: "glance") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.439614 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d" (OuterVolumeSpecName: "glance") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.455754 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.471958 4722 scope.go:117] "RemoveContainer" containerID="abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba"
Feb 19 19:38:10 crc kubenswrapper[4722]: E0219 19:38:10.473069 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba\": container with ID starting with abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba not found: ID does not exist" containerID="abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.473110 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba"} err="failed to get container status \"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba\": rpc error: code = NotFound desc = could not find container \"abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba\": container with ID starting with abc29a4626d722f405e61fbade85b40838e2e66dc18a90564138b0bd727fc9ba not found: ID does not exist"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.473168 4722 scope.go:117] "RemoveContainer" containerID="981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9"
Feb 19 19:38:10 crc kubenswrapper[4722]: E0219 19:38:10.473618 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9\": container with ID starting with 981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9 not found: ID does not exist" containerID="981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.473643 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9"} err="failed to get container status \"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9\": rpc error: code = NotFound desc = could not find container \"981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9\": container with ID starting with 981631a1720074755eb76c16564828502332a44ba960e7e423fcb93aecc2d7a9 not found: ID does not exist"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.478961 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.484220 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.505786 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data" (OuterVolumeSpecName: "config-data") pod "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" (UID: "95c1d6d9-8593-4f8b-a944-5fbbdd277e1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.506462 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data" (OuterVolumeSpecName: "config-data") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.512131 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b5e004a3-da53-4fbb-a396-52e33d205e2e" (UID: "b5e004a3-da53-4fbb-a396-52e33d205e2e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.520959 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.520989 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521026 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") on node \"crc\" "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521040 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vsfq\" (UniqueName: \"kubernetes.io/projected/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-kube-api-access-7vsfq\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521053 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521068 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") on node \"crc\" "
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521078 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8lj4\" (UniqueName: \"kubernetes.io/projected/b5e004a3-da53-4fbb-a396-52e33d205e2e-kube-api-access-l8lj4\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521087 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521098 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521106 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521116 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5e004a3-da53-4fbb-a396-52e33d205e2e-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521125 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5e004a3-da53-4fbb-a396-52e33d205e2e-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521133 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521143 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521164 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.521173 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.544050 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.544286 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5") on node "crc"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.558563 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.558732 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d") on node "crc"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.588956 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.623562 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:10 crc kubenswrapper[4722]: I0219 19:38:10.623604 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") on node \"crc\" DevicePath \"\""
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.107496 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" path="/var/lib/kubelet/pods/0987fde3-8329-4305-bd1c-efa7cf79306b/volumes"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.334963 4722 generic.go:334] "Generic (PLEG): container finished" podID="09a108ba-bb88-4799-a230-638cabf304b0" containerID="8fb5c1c0ec360aa5fc271ce7683847ce4ebe5cbb2a0793d19d34b7cc7bc220b8" exitCode=0
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.336221 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ctdw7" event={"ID":"09a108ba-bb88-4799-a230-638cabf304b0","Type":"ContainerDied","Data":"8fb5c1c0ec360aa5fc271ce7683847ce4ebe5cbb2a0793d19d34b7cc7bc220b8"}
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.336647 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.337250 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.381032 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.399362 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.415038 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.423173 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.429769 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430213 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-httpd"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430275 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-httpd"
Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430333 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-httpd"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430379 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-httpd"
Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430449 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="init"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430496 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="init"
Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430542 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-log"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430586 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-log"
Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430648 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-log"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430694 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-log"
Feb 19 19:38:11 crc kubenswrapper[4722]: E0219 19:38:11.430747 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="dnsmasq-dns"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.430799 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="dnsmasq-dns"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.431010 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-httpd"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.431074 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" containerName="glance-log"
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.431129 4722 memory_manager.go:354] "RemoveStaleState removing state"
podUID="0987fde3-8329-4305-bd1c-efa7cf79306b" containerName="dnsmasq-dns" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.431200 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-log" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.431250 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" containerName="glance-httpd" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.432226 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.446165 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.447569 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.447699 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.458197 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9s8kl" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.458516 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.458785 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.459056 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.459365 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.459182 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.476039 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556448 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46np\" (UniqueName: \"kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556533 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556677 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556744 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556795 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.556929 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557060 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557096 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6q6t\" (UniqueName: \"kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557485 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557750 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557866 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.557960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc 
kubenswrapper[4722]: I0219 19:38:11.659056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659135 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659164 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659184 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659200 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6q6t\" (UniqueName: \"kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659277 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659307 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659326 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659351 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659395 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g46np\" (UniqueName: \"kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659475 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659497 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.659673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.660060 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.660082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.660772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.664482 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.664539 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b323df4ccd136fd865256cd83fe693e56c32fbc8a05d96b41caf6babb703da86/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.664749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.665530 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.665569 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2019fecbddc337ddf53783637eb0008bc901e49a55294deb1e2d06fbb77c3ae3/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.666955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.669351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.670946 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.671193 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.676481 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.676534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.679891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46np\" (UniqueName: \"kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.680775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.688418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6q6t\" (UniqueName: \"kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t\") pod \"glance-default-external-api-0\" (UID: 
\"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.719355 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.723483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " pod="openstack/glance-default-external-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.783300 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.799190 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.799245 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:38:11 crc kubenswrapper[4722]: I0219 19:38:11.807028 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:38:13 crc kubenswrapper[4722]: I0219 19:38:13.085440 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c1d6d9-8593-4f8b-a944-5fbbdd277e1e" path="/var/lib/kubelet/pods/95c1d6d9-8593-4f8b-a944-5fbbdd277e1e/volumes" Feb 19 19:38:13 crc kubenswrapper[4722]: I0219 19:38:13.086860 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e004a3-da53-4fbb-a396-52e33d205e2e" path="/var/lib/kubelet/pods/b5e004a3-da53-4fbb-a396-52e33d205e2e/volumes" Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.895376 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ctdw7" Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941598 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941668 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941701 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xjh6\" (UniqueName: \"kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941733 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.941839 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle\") pod \"09a108ba-bb88-4799-a230-638cabf304b0\" (UID: \"09a108ba-bb88-4799-a230-638cabf304b0\") " Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.949325 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.949390 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.949822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts" (OuterVolumeSpecName: "scripts") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.951134 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6" (OuterVolumeSpecName: "kube-api-access-4xjh6") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "kube-api-access-4xjh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.974005 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data" (OuterVolumeSpecName: "config-data") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:15 crc kubenswrapper[4722]: I0219 19:38:15.980359 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09a108ba-bb88-4799-a230-638cabf304b0" (UID: "09a108ba-bb88-4799-a230-638cabf304b0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043017 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043393 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xjh6\" (UniqueName: \"kubernetes.io/projected/09a108ba-bb88-4799-a230-638cabf304b0-kube-api-access-4xjh6\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043405 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043415 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043423 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.043431 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/09a108ba-bb88-4799-a230-638cabf304b0-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.381406 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ctdw7" event={"ID":"09a108ba-bb88-4799-a230-638cabf304b0","Type":"ContainerDied","Data":"9d69e23e43e8ab2aa747e1b227270b4fab24359a7aa862c4ab858b12cf3f9985"} Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 
19:38:16.381460 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ctdw7" Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.381463 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d69e23e43e8ab2aa747e1b227270b4fab24359a7aa862c4ab858b12cf3f9985" Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.988257 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ctdw7"] Feb 19 19:38:16 crc kubenswrapper[4722]: I0219 19:38:16.996091 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ctdw7"] Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.067470 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-f6zx8"] Feb 19 19:38:17 crc kubenswrapper[4722]: E0219 19:38:17.068345 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a108ba-bb88-4799-a230-638cabf304b0" containerName="keystone-bootstrap" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.068496 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a108ba-bb88-4799-a230-638cabf304b0" containerName="keystone-bootstrap" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.069539 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a108ba-bb88-4799-a230-638cabf304b0" containerName="keystone-bootstrap" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.072094 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.074236 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.074386 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.074486 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qhj8b" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.074600 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.110452 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a108ba-bb88-4799-a230-638cabf304b0" path="/var/lib/kubelet/pods/09a108ba-bb88-4799-a230-638cabf304b0/volumes" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.111530 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f6zx8"] Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168459 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168493 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168553 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168572 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdxx8\" (UniqueName: \"kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.168691 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.269898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.269998 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.270027 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.270113 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.270179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdxx8\" (UniqueName: \"kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.270286 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.274329 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.274801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.277781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.282172 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.285697 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts\") pod \"keystone-bootstrap-f6zx8\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.290813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdxx8\" (UniqueName: \"kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8\") pod \"keystone-bootstrap-f6zx8\" (UID: 
\"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.395526 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.824401 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.881626 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jvpfv"] Feb 19 19:38:17 crc kubenswrapper[4722]: I0219 19:38:17.881871 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" containerID="cri-o://acdda2995a7c01c2bb56033df969b7a728b55472a6cb4f9472db1062d35bc9c3" gracePeriod=10 Feb 19 19:38:18 crc kubenswrapper[4722]: I0219 19:38:18.401820 4722 generic.go:334] "Generic (PLEG): container finished" podID="b12e3334-cc75-47af-870a-3d86164cb249" containerID="acdda2995a7c01c2bb56033df969b7a728b55472a6cb4f9472db1062d35bc9c3" exitCode=0 Feb 19 19:38:18 crc kubenswrapper[4722]: I0219 19:38:18.401904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jvpfv" event={"ID":"b12e3334-cc75-47af-870a-3d86164cb249","Type":"ContainerDied","Data":"acdda2995a7c01c2bb56033df969b7a728b55472a6cb4f9472db1062d35bc9c3"} Feb 19 19:38:19 crc kubenswrapper[4722]: I0219 19:38:19.672064 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Feb 19 19:38:20 crc kubenswrapper[4722]: I0219 19:38:20.589512 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 19 19:38:20 crc kubenswrapper[4722]: I0219 19:38:20.596299 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 19 19:38:21 crc kubenswrapper[4722]: I0219 19:38:21.436970 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 19 19:38:24 crc kubenswrapper[4722]: I0219 19:38:24.458379 4722 generic.go:334] "Generic (PLEG): container finished" podID="eab1ce59-2254-419a-bab0-cf5e87888634" containerID="a1c03548ff56ab3102ffaa64e0990092747adeddc1030d3c048e1f3f59e0095b" exitCode=0 Feb 19 19:38:24 crc kubenswrapper[4722]: I0219 19:38:24.458456 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b98l" event={"ID":"eab1ce59-2254-419a-bab0-cf5e87888634","Type":"ContainerDied","Data":"a1c03548ff56ab3102ffaa64e0990092747adeddc1030d3c048e1f3f59e0095b"} Feb 19 19:38:24 crc kubenswrapper[4722]: I0219 19:38:24.672570 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Feb 19 19:38:29 crc kubenswrapper[4722]: I0219 19:38:29.673278 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Feb 19 19:38:29 crc kubenswrapper[4722]: I0219 19:38:29.674119 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.573851 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7b98l" 
event={"ID":"eab1ce59-2254-419a-bab0-cf5e87888634","Type":"ContainerDied","Data":"faf928cc455dc359a2347459ccaaf8574498adb59570eba6f898fdc7c69b0cd6"} Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.574141 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faf928cc455dc359a2347459ccaaf8574498adb59570eba6f898fdc7c69b0cd6" Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.672090 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b98l" Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.774225 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n4nt\" (UniqueName: \"kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt\") pod \"eab1ce59-2254-419a-bab0-cf5e87888634\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.774362 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config\") pod \"eab1ce59-2254-419a-bab0-cf5e87888634\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.774457 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle\") pod \"eab1ce59-2254-419a-bab0-cf5e87888634\" (UID: \"eab1ce59-2254-419a-bab0-cf5e87888634\") " Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.800399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt" (OuterVolumeSpecName: "kube-api-access-4n4nt") pod "eab1ce59-2254-419a-bab0-cf5e87888634" (UID: "eab1ce59-2254-419a-bab0-cf5e87888634"). 
InnerVolumeSpecName "kube-api-access-4n4nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.812454 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config" (OuterVolumeSpecName: "config") pod "eab1ce59-2254-419a-bab0-cf5e87888634" (UID: "eab1ce59-2254-419a-bab0-cf5e87888634"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.821253 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eab1ce59-2254-419a-bab0-cf5e87888634" (UID: "eab1ce59-2254-419a-bab0-cf5e87888634"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.896395 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.896739 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n4nt\" (UniqueName: \"kubernetes.io/projected/eab1ce59-2254-419a-bab0-cf5e87888634-kube-api-access-4n4nt\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:31 crc kubenswrapper[4722]: I0219 19:38:31.896756 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eab1ce59-2254-419a-bab0-cf5e87888634-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.131413 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.581780 4722 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7b98l" Feb 19 19:38:32 crc kubenswrapper[4722]: E0219 19:38:32.860029 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 19 19:38:32 crc kubenswrapper[4722]: E0219 19:38:32.860214 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca
-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cdjl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-nldcm_openstack(512a4c5e-3ea6-42a8-9f83-8c0e5375891d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:38:32 crc kubenswrapper[4722]: E0219 19:38:32.862175 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-nldcm" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.944534 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"] Feb 19 19:38:32 crc kubenswrapper[4722]: E0219 19:38:32.944988 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab1ce59-2254-419a-bab0-cf5e87888634" containerName="neutron-db-sync" Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.945005 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab1ce59-2254-419a-bab0-cf5e87888634" containerName="neutron-db-sync" Feb 19 
19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.945209 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab1ce59-2254-419a-bab0-cf5e87888634" containerName="neutron-db-sync" Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.946264 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:32 crc kubenswrapper[4722]: I0219 19:38:32.958309 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"] Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014672 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsg42\" (UniqueName: \"kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014719 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014766 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014809 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014866 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.014891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.100743 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.102368 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.106602 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.106655 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.106615 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4wknf" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.106916 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.112265 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.115891 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsg42\" (UniqueName: \"kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.115950 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116002 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: 
\"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116048 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116119 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.116971 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" 
Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.117082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.117299 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.117774 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.151813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsg42\" (UniqueName: \"kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42\") pod \"dnsmasq-dns-55f844cf75-hcfgw\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") " pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.218301 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.218361 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.218416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbx7r\" (UniqueName: \"kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.218514 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.218637 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.271834 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.321039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.321179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.321209 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.321242 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbx7r\" (UniqueName: \"kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.321300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 
19:38:33.325368 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.329737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.331043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.332385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.339422 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbx7r\" (UniqueName: \"kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r\") pod \"neutron-7445db86-7r6w9\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:33 crc kubenswrapper[4722]: I0219 19:38:33.423956 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:38:34 crc kubenswrapper[4722]: E0219 19:38:34.282790 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-nldcm" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" Feb 19 19:38:34 crc kubenswrapper[4722]: E0219 19:38:34.983526 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 19 19:38:34 crc kubenswrapper[4722]: E0219 19:38:34.983681 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dvk66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-lnf5k_openstack(9c2453a9-4c81-4256-b52d-edb69c12c7d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:38:34 crc kubenswrapper[4722]: E0219 19:38:34.984827 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-lnf5k" 
podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" Feb 19 19:38:35 crc kubenswrapper[4722]: W0219 19:38:35.000791 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74c5d98f_45b4_4fd8_876b_3471da720a4b.slice/crio-8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d WatchSource:0}: Error finding container 8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d: Status 404 returned error can't find the container with id 8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.121909 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.253814 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb\") pod \"b12e3334-cc75-47af-870a-3d86164cb249\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.253884 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs5nx\" (UniqueName: \"kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx\") pod \"b12e3334-cc75-47af-870a-3d86164cb249\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.253921 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc\") pod \"b12e3334-cc75-47af-870a-3d86164cb249\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.254126 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config\") pod \"b12e3334-cc75-47af-870a-3d86164cb249\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.254220 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb\") pod \"b12e3334-cc75-47af-870a-3d86164cb249\" (UID: \"b12e3334-cc75-47af-870a-3d86164cb249\") " Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.261760 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx" (OuterVolumeSpecName: "kube-api-access-bs5nx") pod "b12e3334-cc75-47af-870a-3d86164cb249" (UID: "b12e3334-cc75-47af-870a-3d86164cb249"). InnerVolumeSpecName "kube-api-access-bs5nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.283967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-jvpfv" event={"ID":"b12e3334-cc75-47af-870a-3d86164cb249","Type":"ContainerDied","Data":"1028e5969f0d7dbc2c219bf0143cef7647b9346e4f36673b7b607399975bc325"} Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.284046 4722 scope.go:117] "RemoveContainer" containerID="acdda2995a7c01c2bb56033df969b7a728b55472a6cb4f9472db1062d35bc9c3" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.284226 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-jvpfv" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.290569 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerStarted","Data":"8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d"} Feb 19 19:38:35 crc kubenswrapper[4722]: E0219 19:38:35.291564 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-lnf5k" podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.320843 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b12e3334-cc75-47af-870a-3d86164cb249" (UID: "b12e3334-cc75-47af-870a-3d86164cb249"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.324557 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config" (OuterVolumeSpecName: "config") pod "b12e3334-cc75-47af-870a-3d86164cb249" (UID: "b12e3334-cc75-47af-870a-3d86164cb249"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.340211 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b12e3334-cc75-47af-870a-3d86164cb249" (UID: "b12e3334-cc75-47af-870a-3d86164cb249"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.342858 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b12e3334-cc75-47af-870a-3d86164cb249" (UID: "b12e3334-cc75-47af-870a-3d86164cb249"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.356187 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.356219 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.356229 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs5nx\" (UniqueName: \"kubernetes.io/projected/b12e3334-cc75-47af-870a-3d86164cb249-kube-api-access-bs5nx\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.356239 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.356250 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12e3334-cc75-47af-870a-3d86164cb249-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.636023 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-698758b865-jvpfv"] Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.651668 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-jvpfv"] Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.870870 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d74fd689-q5qhb"] Feb 19 19:38:35 crc kubenswrapper[4722]: E0219 19:38:35.871589 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="init" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.871616 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="init" Feb 19 19:38:35 crc kubenswrapper[4722]: E0219 19:38:35.871638 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.871646 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.871883 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.873092 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.884371 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.884923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.893585 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d74fd689-q5qhb"] Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967439 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2qkz\" (UniqueName: \"kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967553 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:35 crc kubenswrapper[4722]: I0219 19:38:35.967602 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.069808 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.069881 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2qkz\" (UniqueName: 
\"kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.069908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.070033 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.070061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.070114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.070177 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: 
\"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.075493 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.076275 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.076417 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.077332 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.079576 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.082608 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.089758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2qkz\" (UniqueName: \"kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz\") pod \"neutron-d74fd689-q5qhb\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:36 crc kubenswrapper[4722]: I0219 19:38:36.195048 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:37 crc kubenswrapper[4722]: I0219 19:38:37.081941 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12e3334-cc75-47af-870a-3d86164cb249" path="/var/lib/kubelet/pods/b12e3334-cc75-47af-870a-3d86164cb249/volumes" Feb 19 19:38:37 crc kubenswrapper[4722]: I0219 19:38:37.898513 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:38:39 crc kubenswrapper[4722]: I0219 19:38:39.672278 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-jvpfv" podUID="b12e3334-cc75-47af-870a-3d86164cb249" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout" Feb 19 19:38:39 crc kubenswrapper[4722]: W0219 19:38:39.975444 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58e51a47_7d37_46de_96cc_609365fab496.slice/crio-9d53207634e6b7ef9226749da4be244094bd8e2655c281755c661fa33e7511ac WatchSource:0}: Error finding container 
9d53207634e6b7ef9226749da4be244094bd8e2655c281755c661fa33e7511ac: Status 404 returned error can't find the container with id 9d53207634e6b7ef9226749da4be244094bd8e2655c281755c661fa33e7511ac Feb 19 19:38:40 crc kubenswrapper[4722]: I0219 19:38:40.338254 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerStarted","Data":"9d53207634e6b7ef9226749da4be244094bd8e2655c281755c661fa33e7511ac"} Feb 19 19:38:40 crc kubenswrapper[4722]: I0219 19:38:40.425122 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f6zx8"] Feb 19 19:38:40 crc kubenswrapper[4722]: I0219 19:38:40.552248 4722 scope.go:117] "RemoveContainer" containerID="58f8459d38255bc0ee2a3b1d7c9b5ab8e43bfd9e3de2e5dd8ef6021c2a7233ed" Feb 19 19:38:40 crc kubenswrapper[4722]: W0219 19:38:40.591514 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6175472a_2fd6_4b07_bcb1_4e441a4587aa.slice/crio-11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049 WatchSource:0}: Error finding container 11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049: Status 404 returned error can't find the container with id 11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049 Feb 19 19:38:40 crc kubenswrapper[4722]: E0219 19:38:40.601945 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 19 19:38:40 crc kubenswrapper[4722]: E0219 19:38:40.601994 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 19 19:38:40 crc kubenswrapper[4722]: E0219 19:38:40.602202 
4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l7zht,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privil
eged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-xdgs2_openstack(fb399ce1-7269-4d99-9140-0d1d33a6fd6a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 19:38:40 crc kubenswrapper[4722]: E0219 19:38:40.603357 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-xdgs2" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.089584 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"] Feb 19 19:38:41 crc kubenswrapper[4722]: W0219 19:38:41.132047 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf618be57_2b9f_4455_8de0_90379bc9d57b.slice/crio-b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea WatchSource:0}: Error finding container b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea: Status 404 returned error can't find the container with id b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.304501 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d74fd689-q5qhb"] Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.349020 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerStarted","Data":"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.352268 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" event={"ID":"f618be57-2b9f-4455-8de0-90379bc9d57b","Type":"ContainerStarted","Data":"b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.353589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zrwzj" event={"ID":"41216a8d-32f8-4ec6-ab65-5474453cad03","Type":"ContainerStarted","Data":"90f4e39d24966e113ef88317b89ebc0b17164774e86b8e7cdf9bced518e5ecd6"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.362052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zx8" event={"ID":"6175472a-2fd6-4b07-bcb1-4e441a4587aa","Type":"ContainerStarted","Data":"c6a2c92ed1dfd6a529b0d6c2d06234eb6f8f5c4b6c0afa3fd878de3dc02ea9ee"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.362139 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zx8" event={"ID":"6175472a-2fd6-4b07-bcb1-4e441a4587aa","Type":"ContainerStarted","Data":"11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.365200 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerStarted","Data":"245b2a4bf08b03ca07fdc608528d3501f8e470227ac611d75e1e28818470fe64"} Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.370241 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zrwzj" podStartSLOduration=8.316199213 podStartE2EDuration="38.370226939s" podCreationTimestamp="2026-02-19 19:38:03 +0000 
UTC" firstStartedPulling="2026-02-19 19:38:04.947520156 +0000 UTC m=+1184.559870480" lastFinishedPulling="2026-02-19 19:38:35.001547882 +0000 UTC m=+1214.613898206" observedRunningTime="2026-02-19 19:38:41.368463903 +0000 UTC m=+1220.980814247" watchObservedRunningTime="2026-02-19 19:38:41.370226939 +0000 UTC m=+1220.982577253" Feb 19 19:38:41 crc kubenswrapper[4722]: E0219 19:38:41.378645 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-xdgs2" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.399456 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-f6zx8" podStartSLOduration=24.399424066999998 podStartE2EDuration="24.399424067s" podCreationTimestamp="2026-02-19 19:38:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:41.388426674 +0000 UTC m=+1221.000776999" watchObservedRunningTime="2026-02-19 19:38:41.399424067 +0000 UTC m=+1221.011774391" Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.421118 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.805357 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:38:41 crc kubenswrapper[4722]: I0219 19:38:41.805660 4722 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.390336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerStarted","Data":"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.390981 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerStarted","Data":"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.395702 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerStarted","Data":"6cecb6c27a5d8a3d6ffee2f1f0d633c671295bd59fc22535a5bf9eb9959995c0"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.395736 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerStarted","Data":"e28ba51730232a08a3cd5dc96327f73be33823e4b79e43d0c66d0800f455e9e0"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.396533 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.402597 4722 generic.go:334] "Generic (PLEG): container finished" podID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerID="4dec94c6774384698a0cf861b554d74fb1ddd8514338b3e11d17056ce861d124" exitCode=0 Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 
19:38:42.402829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" event={"ID":"f618be57-2b9f-4455-8de0-90379bc9d57b","Type":"ContainerDied","Data":"4dec94c6774384698a0cf861b554d74fb1ddd8514338b3e11d17056ce861d124"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.409811 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerStarted","Data":"3ce9bc56dc0250472fbd7d818bb628d5fdf7798657a6fd7b1570bd5c3b64c1ae"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.409847 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerStarted","Data":"0f3ddcaf8c81704eaf6b201c98a6bdf76e2b380c4dac2d9db9d77cb9f737e62a"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.422650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerStarted","Data":"df36524cd2a523caf0ae3f85ddef265e7c54e5ba8fa2da85c3fd083ca4ebd887"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.422691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerStarted","Data":"6956d55506ad813de368c67533400189dca7fad85038770d3e67703d4229d5da"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.422703 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerStarted","Data":"0d269d0087152d6edd92c6c1c2324f5e6566d6cbbbcd03d88628b974769fb6f5"} Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.422717 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7445db86-7r6w9" Feb 
19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.451852 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-d74fd689-q5qhb" podStartSLOduration=7.451833543 podStartE2EDuration="7.451833543s" podCreationTimestamp="2026-02-19 19:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:42.444307259 +0000 UTC m=+1222.056657583" watchObservedRunningTime="2026-02-19 19:38:42.451833543 +0000 UTC m=+1222.064183857" Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.458402 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=31.458389777 podStartE2EDuration="31.458389777s" podCreationTimestamp="2026-02-19 19:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:42.421449028 +0000 UTC m=+1222.033799352" watchObservedRunningTime="2026-02-19 19:38:42.458389777 +0000 UTC m=+1222.070740101" Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.494045 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=31.494029246 podStartE2EDuration="31.494029246s" podCreationTimestamp="2026-02-19 19:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:42.493245632 +0000 UTC m=+1222.105595956" watchObservedRunningTime="2026-02-19 19:38:42.494029246 +0000 UTC m=+1222.106379570" Feb 19 19:38:42 crc kubenswrapper[4722]: I0219 19:38:42.523988 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7445db86-7r6w9" podStartSLOduration=9.523958768 podStartE2EDuration="9.523958768s" podCreationTimestamp="2026-02-19 19:38:33 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:42.508666562 +0000 UTC m=+1222.121016906" watchObservedRunningTime="2026-02-19 19:38:42.523958768 +0000 UTC m=+1222.136309092" Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.464616 4722 generic.go:334] "Generic (PLEG): container finished" podID="6175472a-2fd6-4b07-bcb1-4e441a4587aa" containerID="c6a2c92ed1dfd6a529b0d6c2d06234eb6f8f5c4b6c0afa3fd878de3dc02ea9ee" exitCode=0 Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.464909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zx8" event={"ID":"6175472a-2fd6-4b07-bcb1-4e441a4587aa","Type":"ContainerDied","Data":"c6a2c92ed1dfd6a529b0d6c2d06234eb6f8f5c4b6c0afa3fd878de3dc02ea9ee"} Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.471002 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" event={"ID":"f618be57-2b9f-4455-8de0-90379bc9d57b","Type":"ContainerStarted","Data":"0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06"} Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.471124 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.472956 4722 generic.go:334] "Generic (PLEG): container finished" podID="41216a8d-32f8-4ec6-ab65-5474453cad03" containerID="90f4e39d24966e113ef88317b89ebc0b17164774e86b8e7cdf9bced518e5ecd6" exitCode=0 Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.473008 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zrwzj" event={"ID":"41216a8d-32f8-4ec6-ab65-5474453cad03","Type":"ContainerDied","Data":"90f4e39d24966e113ef88317b89ebc0b17164774e86b8e7cdf9bced518e5ecd6"} Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.486967 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerStarted","Data":"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351"} Feb 19 19:38:46 crc kubenswrapper[4722]: I0219 19:38:46.502517 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" podStartSLOduration=14.502500304 podStartE2EDuration="14.502500304s" podCreationTimestamp="2026-02-19 19:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:46.50205859 +0000 UTC m=+1226.114408924" watchObservedRunningTime="2026-02-19 19:38:46.502500304 +0000 UTC m=+1226.114850628" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.142454 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.151618 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.311889 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle\") pod \"41216a8d-32f8-4ec6-ab65-5474453cad03\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312597 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs\") pod \"41216a8d-32f8-4ec6-ab65-5474453cad03\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts\") pod \"41216a8d-32f8-4ec6-ab65-5474453cad03\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312666 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdxx8\" (UniqueName: \"kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312699 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312752 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data\") pod \"41216a8d-32f8-4ec6-ab65-5474453cad03\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmgm9\" (UniqueName: \"kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9\") pod \"41216a8d-32f8-4ec6-ab65-5474453cad03\" (UID: \"41216a8d-32f8-4ec6-ab65-5474453cad03\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.312821 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data\") pod \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\" (UID: \"6175472a-2fd6-4b07-bcb1-4e441a4587aa\") " Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 
19:38:50.313000 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs" (OuterVolumeSpecName: "logs") pod "41216a8d-32f8-4ec6-ab65-5474453cad03" (UID: "41216a8d-32f8-4ec6-ab65-5474453cad03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.313541 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41216a8d-32f8-4ec6-ab65-5474453cad03-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328303 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9" (OuterVolumeSpecName: "kube-api-access-kmgm9") pod "41216a8d-32f8-4ec6-ab65-5474453cad03" (UID: "41216a8d-32f8-4ec6-ab65-5474453cad03"). InnerVolumeSpecName "kube-api-access-kmgm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328364 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts" (OuterVolumeSpecName: "scripts") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8" (OuterVolumeSpecName: "kube-api-access-kdxx8") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "kube-api-access-kdxx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.328364 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts" (OuterVolumeSpecName: "scripts") pod "41216a8d-32f8-4ec6-ab65-5474453cad03" (UID: "41216a8d-32f8-4ec6-ab65-5474453cad03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.339590 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data" (OuterVolumeSpecName: "config-data") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.344864 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data" (OuterVolumeSpecName: "config-data") pod "41216a8d-32f8-4ec6-ab65-5474453cad03" (UID: "41216a8d-32f8-4ec6-ab65-5474453cad03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.345455 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41216a8d-32f8-4ec6-ab65-5474453cad03" (UID: "41216a8d-32f8-4ec6-ab65-5474453cad03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.345912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6175472a-2fd6-4b07-bcb1-4e441a4587aa" (UID: "6175472a-2fd6-4b07-bcb1-4e441a4587aa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.414833 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmgm9\" (UniqueName: \"kubernetes.io/projected/41216a8d-32f8-4ec6-ab65-5474453cad03-kube-api-access-kmgm9\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.414959 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415030 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415089 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415142 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415211 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415261 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415312 4722 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-kdxx8\" (UniqueName: \"kubernetes.io/projected/6175472a-2fd6-4b07-bcb1-4e441a4587aa-kube-api-access-kdxx8\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415376 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6175472a-2fd6-4b07-bcb1-4e441a4587aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.415436 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41216a8d-32f8-4ec6-ab65-5474453cad03-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.540442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f6zx8" event={"ID":"6175472a-2fd6-4b07-bcb1-4e441a4587aa","Type":"ContainerDied","Data":"11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049"} Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.540505 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11171954e52f156eda5afe4276f6db2ca22ce2d1145e589fdb8708cc26950049" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.540609 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-f6zx8" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.552371 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zrwzj" event={"ID":"41216a8d-32f8-4ec6-ab65-5474453cad03","Type":"ContainerDied","Data":"459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94"} Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.553193 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="459f00c8be5fc495863103c5ddaf3ba201cb2953a4237a8efa971c34be8b5a94" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.552402 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zrwzj" Feb 19 19:38:50 crc kubenswrapper[4722]: I0219 19:38:50.555299 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerStarted","Data":"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd"} Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.243446 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7cb5f76f4-hx5jh"] Feb 19 19:38:51 crc kubenswrapper[4722]: E0219 19:38:51.244099 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41216a8d-32f8-4ec6-ab65-5474453cad03" containerName="placement-db-sync" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.244112 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="41216a8d-32f8-4ec6-ab65-5474453cad03" containerName="placement-db-sync" Feb 19 19:38:51 crc kubenswrapper[4722]: E0219 19:38:51.244124 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6175472a-2fd6-4b07-bcb1-4e441a4587aa" containerName="keystone-bootstrap" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.244130 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6175472a-2fd6-4b07-bcb1-4e441a4587aa" containerName="keystone-bootstrap" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.244368 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="41216a8d-32f8-4ec6-ab65-5474453cad03" containerName="placement-db-sync" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.244381 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6175472a-2fd6-4b07-bcb1-4e441a4587aa" containerName="keystone-bootstrap" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.245099 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7cb5f76f4-hx5jh" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259029 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cb5f76f4-hx5jh"] Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259335 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259558 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259674 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259773 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qhj8b" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.259958 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.260121 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338011 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-combined-ca-bundle\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-fernet-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338176 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-credential-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338206 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-internal-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338233 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-scripts\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338264 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-config-data\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-public-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.338313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gvk\" (UniqueName: \"kubernetes.io/projected/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-kube-api-access-c6gvk\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.362923 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7cc6894556-2r5j6"]
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.380121 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cc6894556-2r5j6"]
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.380304 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.385643 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.385870 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.385989 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7xcck"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.386109 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.386624 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x57pr\" (UniqueName: \"kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439615 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-credential-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-internal-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-scripts\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-config-data\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-public-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gvk\" (UniqueName: \"kubernetes.io/projected/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-kube-api-access-c6gvk\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-combined-ca-bundle\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439900 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-fernet-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.439963 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.444855 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-credential-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.445142 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-scripts\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.446268 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-combined-ca-bundle\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.453737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-internal-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.455466 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-fernet-keys\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.460263 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-config-data\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.462757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-public-tls-certs\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.463722 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gvk\" (UniqueName: \"kubernetes.io/projected/32b3c2bb-2288-4e2e-a9c6-d19cfe651181-kube-api-access-c6gvk\") pod \"keystone-7cb5f76f4-hx5jh\" (UID: \"32b3c2bb-2288-4e2e-a9c6-d19cfe651181\") " pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.541903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.542186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.542297 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.542834 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x57pr\" (UniqueName: \"kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.544197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.544585 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.544769 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.545281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.545689 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.547798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.548043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.548515 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.548521 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.562789 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x57pr\" (UniqueName: \"kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr\") pod \"placement-7cc6894556-2r5j6\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") " pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.565635 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lnf5k" event={"ID":"9c2453a9-4c81-4256-b52d-edb69c12c7d7","Type":"ContainerStarted","Data":"30471834ccd229c96e079cf27c896a4ce03111bf3efa26fc347d5a87d8bb97cd"}
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.567847 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nldcm" event={"ID":"512a4c5e-3ea6-42a8-9f83-8c0e5375891d","Type":"ContainerStarted","Data":"fe4925460ebe652124a5ffa51ecf1f233c20847811e9da501b19b829671482b6"}
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.595180 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-lnf5k" podStartSLOduration=3.6166137149999997 podStartE2EDuration="49.595142154s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="2026-02-19 19:38:04.623324989 +0000 UTC m=+1184.235675313" lastFinishedPulling="2026-02-19 19:38:50.601853428 +0000 UTC m=+1230.214203752" observedRunningTime="2026-02-19 19:38:51.590334045 +0000 UTC m=+1231.202684379" watchObservedRunningTime="2026-02-19 19:38:51.595142154 +0000 UTC m=+1231.207492478"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.601797 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.625666 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nldcm" podStartSLOduration=3.332261608 podStartE2EDuration="49.625641044s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="2026-02-19 19:38:04.304398955 +0000 UTC m=+1183.916749279" lastFinishedPulling="2026-02-19 19:38:50.597778391 +0000 UTC m=+1230.210128715" observedRunningTime="2026-02-19 19:38:51.617125659 +0000 UTC m=+1231.229475983" watchObservedRunningTime="2026-02-19 19:38:51.625641044 +0000 UTC m=+1231.237991368"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.674944 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7cc7c8879d-tnbfs"]
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.676642 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.686744 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cc7c8879d-tnbfs"]
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.705729 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749034 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-internal-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749122 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hzcg\" (UniqueName: \"kubernetes.io/projected/41b669ab-d733-4941-b134-b9ad19b38143-kube-api-access-8hzcg\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749168 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-combined-ca-bundle\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749209 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-scripts\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749268 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41b669ab-d733-4941-b134-b9ad19b38143-logs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749312 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-public-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.749345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-config-data\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.784045 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.784105 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.808094 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.808465 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.850883 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.851037 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.858075 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hzcg\" (UniqueName: \"kubernetes.io/projected/41b669ab-d733-4941-b134-b9ad19b38143-kube-api-access-8hzcg\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.859530 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-combined-ca-bundle\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.860195 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-scripts\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.860651 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41b669ab-d733-4941-b134-b9ad19b38143-logs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.860723 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-public-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.860759 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-config-data\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.860918 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-internal-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.861889 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.862634 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41b669ab-d733-4941-b134-b9ad19b38143-logs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.864506 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-combined-ca-bundle\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.864948 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-scripts\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.865860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-internal-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.866920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-public-tls-certs\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.866973 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41b669ab-d733-4941-b134-b9ad19b38143-config-data\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.885663 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 19:38:51 crc kubenswrapper[4722]: I0219 19:38:51.897667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hzcg\" (UniqueName: \"kubernetes.io/projected/41b669ab-d733-4941-b134-b9ad19b38143-kube-api-access-8hzcg\") pod \"placement-7cc7c8879d-tnbfs\" (UID: \"41b669ab-d733-4941-b134-b9ad19b38143\") " pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.040876 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.208691 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cb5f76f4-hx5jh"]
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.363838 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cc6894556-2r5j6"]
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.580971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cb5f76f4-hx5jh" event={"ID":"32b3c2bb-2288-4e2e-a9c6-d19cfe651181","Type":"ContainerStarted","Data":"824de0511e4184bceffeffe595fe857a7445d6d50c5df6ccff862b78774504e1"}
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.581026 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cb5f76f4-hx5jh" event={"ID":"32b3c2bb-2288-4e2e-a9c6-d19cfe651181","Type":"ContainerStarted","Data":"324f78b297395cee71e55b72591f8e7896ccf64ced6e26562e5702f64b3dffd4"}
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.581609 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.583409 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerStarted","Data":"c688d661fba42ba2a53e010b04af9f22dbacb7137f02c088f90b0645fc7ab228"}
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.584102 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.584352 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.584370 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.584380 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.669138 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7cb5f76f4-hx5jh" podStartSLOduration=1.669118873 podStartE2EDuration="1.669118873s" podCreationTimestamp="2026-02-19 19:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:52.610705745 +0000 UTC m=+1232.223056069" watchObservedRunningTime="2026-02-19 19:38:52.669118873 +0000 UTC m=+1232.281469197"
Feb 19 19:38:52 crc kubenswrapper[4722]: I0219 19:38:52.685294 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cc7c8879d-tnbfs"]
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.275302 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw"
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.354891 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"]
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.355239 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="dnsmasq-dns" containerID="cri-o://ad4c618017bbd4becca3e0b0113a9facbc19857b41b1b6a0185965e3fe42e985" gracePeriod=10
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.645114 4722 generic.go:334] "Generic (PLEG): container finished" podID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerID="ad4c618017bbd4becca3e0b0113a9facbc19857b41b1b6a0185965e3fe42e985" exitCode=0
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.645482 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" event={"ID":"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0","Type":"ContainerDied","Data":"ad4c618017bbd4becca3e0b0113a9facbc19857b41b1b6a0185965e3fe42e985"}
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.677743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerStarted","Data":"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d"}
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.677809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerStarted","Data":"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df"}
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.677855 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.677907 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.684141 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc7c8879d-tnbfs" event={"ID":"41b669ab-d733-4941-b134-b9ad19b38143","Type":"ContainerStarted","Data":"f51bf694020dd5e39c5b1ce070330d37b6eaff405cc0ba31780c5a96b0a35ded"}
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.684214 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc7c8879d-tnbfs" event={"ID":"41b669ab-d733-4941-b134-b9ad19b38143","Type":"ContainerStarted","Data":"8ec749216293740638000f28d158c0105a2cad692d91aa2514c37dfb1a5704af"}
Feb 19 19:38:53 crc kubenswrapper[4722]: I0219 19:38:53.736422 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7cc6894556-2r5j6" podStartSLOduration=2.736400162 podStartE2EDuration="2.736400162s" podCreationTimestamp="2026-02-19 19:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:53.700725821 +0000 UTC m=+1233.313076155" watchObservedRunningTime="2026-02-19 19:38:53.736400162 +0000 UTC m=+1233.348750496"
Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.320036 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4"
Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.427184 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") "
Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.427733 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65hgp\" (UniqueName: \"kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") "
Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.427776 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") "
Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.427804 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") "
Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.427873 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") "
Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.428038 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb\") pod \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\" (UID: \"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0\") "
Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.455399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp" (OuterVolumeSpecName: "kube-api-access-65hgp") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). InnerVolumeSpecName "kube-api-access-65hgp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.486963 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.496101 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0").
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.530702 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65hgp\" (UniqueName: \"kubernetes.io/projected/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-kube-api-access-65hgp\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.530742 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.530759 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.531167 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config" (OuterVolumeSpecName: "config") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.532057 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.595561 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" (UID: "0396bebf-310d-43c3-a0f5-e8cddf9c3cb0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.634533 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.634578 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.634588 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.713225 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.713244 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.714110 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.720430 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-587r4" event={"ID":"0396bebf-310d-43c3-a0f5-e8cddf9c3cb0","Type":"ContainerDied","Data":"ccc22dda92d98641022846e698ef973d9b21c55e7af354f095475756126633bf"} Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.720501 4722 scope.go:117] "RemoveContainer" containerID="ad4c618017bbd4becca3e0b0113a9facbc19857b41b1b6a0185965e3fe42e985" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.766208 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.776098 4722 scope.go:117] "RemoveContainer" containerID="e6f983685272a18f6384c33405c42fd7cac9d9c7919a092034ad166f31cb8a76" Feb 19 19:38:54 crc kubenswrapper[4722]: I0219 19:38:54.779187 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-587r4"] Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.083654 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" path="/var/lib/kubelet/pods/0396bebf-310d-43c3-a0f5-e8cddf9c3cb0/volumes" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.725833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-xdgs2" event={"ID":"fb399ce1-7269-4d99-9140-0d1d33a6fd6a","Type":"ContainerStarted","Data":"8fba7a7dd2b4b36b32712f1263954190cba9206e6fe4eb845c3663a36d4748db"} Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.736612 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc7c8879d-tnbfs" event={"ID":"41b669ab-d733-4941-b134-b9ad19b38143","Type":"ContainerStarted","Data":"45398e9194cf34a912d856e8546e54bf77cb6213b237928b64ffef777fb10ae1"} Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 
19:38:55.736649 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.736669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cc7c8879d-tnbfs" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.756445 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-xdgs2" podStartSLOduration=4.810693082 podStartE2EDuration="53.756426917s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="2026-02-19 19:38:04.77728589 +0000 UTC m=+1184.389636214" lastFinishedPulling="2026-02-19 19:38:53.723019725 +0000 UTC m=+1233.335370049" observedRunningTime="2026-02-19 19:38:55.750723479 +0000 UTC m=+1235.363073813" watchObservedRunningTime="2026-02-19 19:38:55.756426917 +0000 UTC m=+1235.368777241" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.782907 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7cc7c8879d-tnbfs" podStartSLOduration=4.78288841 podStartE2EDuration="4.78288841s" podCreationTimestamp="2026-02-19 19:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:38:55.776498041 +0000 UTC m=+1235.388848365" watchObservedRunningTime="2026-02-19 19:38:55.78288841 +0000 UTC m=+1235.395238735" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.967044 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.967186 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.973529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" 
Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.979481 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 19:38:55 crc kubenswrapper[4722]: I0219 19:38:55.979602 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 19:38:56 crc kubenswrapper[4722]: I0219 19:38:56.187683 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 19:38:56 crc kubenswrapper[4722]: I0219 19:38:56.750675 4722 generic.go:334] "Generic (PLEG): container finished" podID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" containerID="30471834ccd229c96e079cf27c896a4ce03111bf3efa26fc347d5a87d8bb97cd" exitCode=0 Feb 19 19:38:56 crc kubenswrapper[4722]: I0219 19:38:56.750745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lnf5k" event={"ID":"9c2453a9-4c81-4256-b52d-edb69c12c7d7","Type":"ContainerDied","Data":"30471834ccd229c96e079cf27c896a4ce03111bf3efa26fc347d5a87d8bb97cd"} Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.443297 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.556579 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvk66\" (UniqueName: \"kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66\") pod \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.556670 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data\") pod \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.556853 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle\") pod \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\" (UID: \"9c2453a9-4c81-4256-b52d-edb69c12c7d7\") " Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.562427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9c2453a9-4c81-4256-b52d-edb69c12c7d7" (UID: "9c2453a9-4c81-4256-b52d-edb69c12c7d7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.562486 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66" (OuterVolumeSpecName: "kube-api-access-dvk66") pod "9c2453a9-4c81-4256-b52d-edb69c12c7d7" (UID: "9c2453a9-4c81-4256-b52d-edb69c12c7d7"). 
InnerVolumeSpecName "kube-api-access-dvk66". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.587878 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c2453a9-4c81-4256-b52d-edb69c12c7d7" (UID: "9c2453a9-4c81-4256-b52d-edb69c12c7d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.659096 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.659145 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvk66\" (UniqueName: \"kubernetes.io/projected/9c2453a9-4c81-4256-b52d-edb69c12c7d7-kube-api-access-dvk66\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.659202 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9c2453a9-4c81-4256-b52d-edb69c12c7d7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.788910 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerStarted","Data":"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c"} Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.789009 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-central-agent" 
containerID="cri-o://6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14" gracePeriod=30 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.789058 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.789097 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-notification-agent" containerID="cri-o://891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351" gracePeriod=30 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.789259 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="sg-core" containerID="cri-o://5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd" gracePeriod=30 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.789450 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="proxy-httpd" containerID="cri-o://81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c" gracePeriod=30 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.792250 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lnf5k" event={"ID":"9c2453a9-4c81-4256-b52d-edb69c12c7d7","Type":"ContainerDied","Data":"c55c99500a8dc3a393a869149de80e388347c4c52dbc3f1981dc5cba2b917f9a"} Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.792284 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c55c99500a8dc3a393a869149de80e388347c4c52dbc3f1981dc5cba2b917f9a" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.792343 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lnf5k" Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.798839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nldcm" event={"ID":"512a4c5e-3ea6-42a8-9f83-8c0e5375891d","Type":"ContainerDied","Data":"fe4925460ebe652124a5ffa51ecf1f233c20847811e9da501b19b829671482b6"} Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.798999 4722 generic.go:334] "Generic (PLEG): container finished" podID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" containerID="fe4925460ebe652124a5ffa51ecf1f233c20847811e9da501b19b829671482b6" exitCode=0 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.801338 4722 generic.go:334] "Generic (PLEG): container finished" podID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" containerID="8fba7a7dd2b4b36b32712f1263954190cba9206e6fe4eb845c3663a36d4748db" exitCode=0 Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.801390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-xdgs2" event={"ID":"fb399ce1-7269-4d99-9140-0d1d33a6fd6a","Type":"ContainerDied","Data":"8fba7a7dd2b4b36b32712f1263954190cba9206e6fe4eb845c3663a36d4748db"} Feb 19 19:39:00 crc kubenswrapper[4722]: I0219 19:39:00.815805 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.294954117 podStartE2EDuration="58.815782673s" podCreationTimestamp="2026-02-19 19:38:02 +0000 UTC" firstStartedPulling="2026-02-19 19:38:04.777366602 +0000 UTC m=+1184.389716916" lastFinishedPulling="2026-02-19 19:39:00.298195148 +0000 UTC m=+1239.910545472" observedRunningTime="2026-02-19 19:39:00.811976744 +0000 UTC m=+1240.424327078" watchObservedRunningTime="2026-02-19 19:39:00.815782673 +0000 UTC m=+1240.428133007" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814058 4722 generic.go:334] "Generic (PLEG): container finished" podID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" 
containerID="81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c" exitCode=0 Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814461 4722 generic.go:334] "Generic (PLEG): container finished" podID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerID="5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd" exitCode=2 Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814475 4722 generic.go:334] "Generic (PLEG): container finished" podID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerID="6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14" exitCode=0 Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerDied","Data":"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c"} Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerDied","Data":"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd"} Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.814579 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerDied","Data":"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14"} Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864057 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6767bd5ccf-ggbrg"] Feb 19 19:39:01 crc kubenswrapper[4722]: E0219 19:39:01.864477 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="init" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864490 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="init" Feb 19 19:39:01 crc kubenswrapper[4722]: E0219 19:39:01.864500 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="dnsmasq-dns" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864505 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="dnsmasq-dns" Feb 19 19:39:01 crc kubenswrapper[4722]: E0219 19:39:01.864520 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" containerName="barbican-db-sync" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864529 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" containerName="barbican-db-sync" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864701 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" containerName="barbican-db-sync" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.864720 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0396bebf-310d-43c3-a0f5-e8cddf9c3cb0" containerName="dnsmasq-dns" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.870424 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.881361 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.881707 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.888927 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-tj2ww" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.902005 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6767bd5ccf-ggbrg"] Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.964251 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-98b54b474-9tfhf"] Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.976254 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.979904 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.986576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data-custom\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.986938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66f5042d-2b30-4ac4-8594-cfc0f9590460-logs\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.986992 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.987099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-combined-ca-bundle\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:01 crc kubenswrapper[4722]: I0219 19:39:01.987183 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc24m\" (UniqueName: \"kubernetes.io/projected/66f5042d-2b30-4ac4-8594-cfc0f9590460-kube-api-access-zc24m\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.043548 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-98b54b474-9tfhf"] Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.074897 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"] Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.079180 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.088767 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.088862 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-combined-ca-bundle\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.088891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/96ffdf9d-f932-419b-be31-9f38358d2db5-logs\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.088918 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc8zg\" (UniqueName: \"kubernetes.io/projected/96ffdf9d-f932-419b-be31-9f38358d2db5-kube-api-access-tc8zg\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.089028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66f5042d-2b30-4ac4-8594-cfc0f9590460-logs\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.089135 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.089489 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66f5042d-2b30-4ac4-8594-cfc0f9590460-logs\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.090117 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data-custom\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.090262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-combined-ca-bundle\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.100371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.102970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc24m\" (UniqueName: \"kubernetes.io/projected/66f5042d-2b30-4ac4-8594-cfc0f9590460-kube-api-access-zc24m\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.103098 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data-custom\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.104325 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"] Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.105739 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-combined-ca-bundle\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.108750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66f5042d-2b30-4ac4-8594-cfc0f9590460-config-data-custom\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.117849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc24m\" (UniqueName: \"kubernetes.io/projected/66f5042d-2b30-4ac4-8594-cfc0f9590460-kube-api-access-zc24m\") pod \"barbican-worker-6767bd5ccf-ggbrg\" (UID: \"66f5042d-2b30-4ac4-8594-cfc0f9590460\") " pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.191201 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"] Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.192936 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.198537 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-combined-ca-bundle\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ffdf9d-f932-419b-be31-9f38358d2db5-logs\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204372 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc8zg\" (UniqueName: \"kubernetes.io/projected/96ffdf9d-f932-419b-be31-9f38358d2db5-kube-api-access-tc8zg\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204428 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204460 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data-custom\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204573 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: 
I0219 19:39:02.204624 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxrcx\" (UniqueName: \"kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.204684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.205983 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96ffdf9d-f932-419b-be31-9f38358d2db5-logs\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.211452 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data-custom\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.216676 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-combined-ca-bundle\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " 
pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.218255 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"] Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.220760 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.228927 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc8zg\" (UniqueName: \"kubernetes.io/projected/96ffdf9d-f932-419b-be31-9f38358d2db5-kube-api-access-tc8zg\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.229514 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96ffdf9d-f932-419b-be31-9f38358d2db5-config-data\") pod \"barbican-keystone-listener-98b54b474-9tfhf\" (UID: \"96ffdf9d-f932-419b-be31-9f38358d2db5\") " pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.306901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307506 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " 
pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307621 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307716 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxrcx\" (UniqueName: \"kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307977 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.307896 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " 
pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.308098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.308346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.308440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.308508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.308547 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlq7v\" (UniqueName: \"kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " 
pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.309040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.309618 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.310020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.310187 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.322688 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.328080 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxrcx\" (UniqueName: \"kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx\") pod \"dnsmasq-dns-85ff748b95-tvbws\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") " pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.410994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlq7v\" (UniqueName: \"kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.411067 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.411088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.411143 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: 
\"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.411201 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.414803 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.422009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.424809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.431613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc 
kubenswrapper[4722]: I0219 19:39:02.450173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlq7v\" (UniqueName: \"kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v\") pod \"barbican-api-59d6bc9fcb-2t849\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") " pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.518104 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.530590 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nldcm" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.532577 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618209 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " Feb 19 19:39:02 crc 
kubenswrapper[4722]: I0219 19:39:02.618269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle\") pod \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618302 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7zht\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht\") pod \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618374 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618485 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdjl8\" (UniqueName: \"kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8\") pod \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\" (UID: \"512a4c5e-3ea6-42a8-9f83-8c0e5375891d\") " Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618522 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs\") pod \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data\") pod \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.618653 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts\") pod \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\" (UID: \"fb399ce1-7269-4d99-9140-0d1d33a6fd6a\") " Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.622485 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.625127 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts" (OuterVolumeSpecName: "scripts") pod "fb399ce1-7269-4d99-9140-0d1d33a6fd6a" (UID: "fb399ce1-7269-4d99-9140-0d1d33a6fd6a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.625237 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht" (OuterVolumeSpecName: "kube-api-access-l7zht") pod "fb399ce1-7269-4d99-9140-0d1d33a6fd6a" (UID: "fb399ce1-7269-4d99-9140-0d1d33a6fd6a"). InnerVolumeSpecName "kube-api-access-l7zht". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.629411 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.633554 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs" (OuterVolumeSpecName: "certs") pod "fb399ce1-7269-4d99-9140-0d1d33a6fd6a" (UID: "fb399ce1-7269-4d99-9140-0d1d33a6fd6a"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.635683 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8" (OuterVolumeSpecName: "kube-api-access-cdjl8") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "kube-api-access-cdjl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.639114 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts" (OuterVolumeSpecName: "scripts") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.661966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.663865 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data" (OuterVolumeSpecName: "config-data") pod "fb399ce1-7269-4d99-9140-0d1d33a6fd6a" (UID: "fb399ce1-7269-4d99-9140-0d1d33a6fd6a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.674975 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb399ce1-7269-4d99-9140-0d1d33a6fd6a" (UID: "fb399ce1-7269-4d99-9140-0d1d33a6fd6a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.694644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data" (OuterVolumeSpecName: "config-data") pod "512a4c5e-3ea6-42a8-9f83-8c0e5375891d" (UID: "512a4c5e-3ea6-42a8-9f83-8c0e5375891d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.713707 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726728 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726762 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726779 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726792 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726801 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7zht\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-kube-api-access-l7zht\") on node \"crc\" DevicePath 
\"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726810 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726821 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726829 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdjl8\" (UniqueName: \"kubernetes.io/projected/512a4c5e-3ea6-42a8-9f83-8c0e5375891d-kube-api-access-cdjl8\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726837 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726844 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.726852 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb399ce1-7269-4d99-9140-0d1d33a6fd6a-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.752209 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6767bd5ccf-ggbrg"] Feb 19 19:39:02 crc kubenswrapper[4722]: W0219 19:39:02.753614 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66f5042d_2b30_4ac4_8594_cfc0f9590460.slice/crio-f51f59c70d24d367d5906968a62374321e15df67ee9ff9801d023449e3c7f8f8 WatchSource:0}: Error finding container f51f59c70d24d367d5906968a62374321e15df67ee9ff9801d023449e3c7f8f8: Status 404 returned error can't find the container with id f51f59c70d24d367d5906968a62374321e15df67ee9ff9801d023449e3c7f8f8 Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.831350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nldcm" event={"ID":"512a4c5e-3ea6-42a8-9f83-8c0e5375891d","Type":"ContainerDied","Data":"65597a01a3e59b230c7526b664301c7f8fdd9e898558a558f3adbb4bcd59ec0f"} Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.831401 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65597a01a3e59b230c7526b664301c7f8fdd9e898558a558f3adbb4bcd59ec0f" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.831368 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-nldcm" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.834906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" event={"ID":"66f5042d-2b30-4ac4-8594-cfc0f9590460","Type":"ContainerStarted","Data":"f51f59c70d24d367d5906968a62374321e15df67ee9ff9801d023449e3c7f8f8"} Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.839378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-xdgs2" event={"ID":"fb399ce1-7269-4d99-9140-0d1d33a6fd6a","Type":"ContainerDied","Data":"b485d2ccfdc9766193d0fa763ea0b9af82b812effcaae62a566a8b1ce25316b5"} Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.839414 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b485d2ccfdc9766193d0fa763ea0b9af82b812effcaae62a566a8b1ce25316b5" Feb 19 19:39:02 crc kubenswrapper[4722]: I0219 19:39:02.839479 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-xdgs2" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.013161 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-98b54b474-9tfhf"] Feb 19 19:39:03 crc kubenswrapper[4722]: W0219 19:39:03.016602 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96ffdf9d_f932_419b_be31_9f38358d2db5.slice/crio-210165131bd7b9922fa0264c7a81958a2f4d709a2d8b89d68e01bbfa148a73c2 WatchSource:0}: Error finding container 210165131bd7b9922fa0264c7a81958a2f4d709a2d8b89d68e01bbfa148a73c2: Status 404 returned error can't find the container with id 210165131bd7b9922fa0264c7a81958a2f4d709a2d8b89d68e01bbfa148a73c2 Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.060882 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-86mtg"] Feb 19 19:39:03 crc kubenswrapper[4722]: E0219 19:39:03.061328 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" containerName="cloudkitty-db-sync" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.061344 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" containerName="cloudkitty-db-sync" Feb 19 19:39:03 crc kubenswrapper[4722]: E0219 19:39:03.061366 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" containerName="cinder-db-sync" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.061372 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" containerName="cinder-db-sync" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.061550 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" containerName="cloudkitty-db-sync" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 
19:39:03.061579 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" containerName="cinder-db-sync" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.062288 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.066866 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.066914 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.067024 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-bnkq4" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.074459 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.074695 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.102109 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-86mtg"] Feb 19 19:39:03 crc kubenswrapper[4722]: W0219 19:39:03.128738 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ef6823_2e42_41cf_8eda_f9ea51c8c6f5.slice/crio-c3f9970d8e6df7b907c93856f4d4566a3aeee56e759df554fd7046ca7d3df35d WatchSource:0}: Error finding container c3f9970d8e6df7b907c93856f4d4566a3aeee56e759df554fd7046ca7d3df35d: Status 404 returned error can't find the container with id c3f9970d8e6df7b907c93856f4d4566a3aeee56e759df554fd7046ca7d3df35d Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.134694 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.134884 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.134927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.134972 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw9tj\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.135035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.136916 4722 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.155861 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.158759 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.161752 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-4h658" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.162091 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.163301 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.175850 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.185085 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.236429 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237691 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237766 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw9tj\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237899 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq4kd\" (UniqueName: \"kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237946 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.237985 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.238025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.238043 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.238101 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.242116 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts\") pod \"cloudkitty-storageinit-86mtg\" 
(UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.249451 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.271036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.282260 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw9tj\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.317559 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.320528 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.325114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle\") pod \"cloudkitty-storageinit-86mtg\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") " pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.325320 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.341860 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.369036 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.407223 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.413854 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq4kd\" (UniqueName: \"kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.413928 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.414764 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-86mtg" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.415821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.415931 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l94k\" (UniqueName: \"kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.415969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.416023 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.416100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 
19:39:03.416235 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417007 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417127 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417383 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.417536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.420891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.421959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.422439 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.422530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc 
kubenswrapper[4722]: I0219 19:39:03.439058 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq4kd\" (UniqueName: \"kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd\") pod \"cinder-scheduler-0\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.439898 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.449911 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"] Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.462275 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.505244 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l94k\" (UniqueName: 
\"kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519964 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.519988 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zffz\" (UniqueName: \"kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.520012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.520041 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.520060 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.520085 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:03 crc 
kubenswrapper[4722]: I0219 19:39:03.520859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.521730 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.528080 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.528754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.529320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.534380 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.540573 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l94k\" (UniqueName: \"kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k\") pod \"cinder-api-0\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") " pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.558369 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621066 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621338 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621378 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zffz\" (UniqueName: \"kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621418 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621444 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.621488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.622510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.622698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.624419 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.624590 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.625425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.642781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zffz\" (UniqueName: \"kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz\") pod \"dnsmasq-dns-5c9776ccc5-7hq9c\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.772356 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d74fd689-q5qhb"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.772578 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d74fd689-q5qhb" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-api" containerID="cri-o://e28ba51730232a08a3cd5dc96327f73be33823e4b79e43d0c66d0800f455e9e0" gracePeriod=30
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.773000 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-d74fd689-q5qhb" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" containerID="cri-o://6cecb6c27a5d8a3d6ffee2f1f0d633c671295bd59fc22535a5bf9eb9959995c0" gracePeriod=30
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.819996 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8694c7b8f7-2td8g"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.822059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.825453 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-d74fd689-q5qhb" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.171:9696/\": EOF"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.881093 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8694c7b8f7-2td8g"]
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.885127 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.915368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" event={"ID":"96ffdf9d-f932-419b-be31-9f38358d2db5","Type":"ContainerStarted","Data":"210165131bd7b9922fa0264c7a81958a2f4d709a2d8b89d68e01bbfa148a73c2"}
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.921411 4722 generic.go:334] "Generic (PLEG): container finished" podID="d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" containerID="3ee66e2d9f3ff70ba95788fa2041bc3ab471615c0403f4144d9bc1ae897eb89c" exitCode=0
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.921478 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" event={"ID":"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5","Type":"ContainerDied","Data":"3ee66e2d9f3ff70ba95788fa2041bc3ab471615c0403f4144d9bc1ae897eb89c"}
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.921505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" event={"ID":"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5","Type":"ContainerStarted","Data":"c3f9970d8e6df7b907c93856f4d4566a3aeee56e759df554fd7046ca7d3df35d"}
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.931074 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9b27\" (UniqueName: \"kubernetes.io/projected/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-kube-api-access-p9b27\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.931350 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-ovndb-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.931548 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.931764 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-public-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.932104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-combined-ca-bundle\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.932256 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-internal-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.932363 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-httpd-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.931779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerStarted","Data":"deec74702da9c72730adb9e092792817648059be9727bcc7760a0aa5c428553c"}
Feb 19 19:39:03 crc kubenswrapper[4722]: I0219 19:39:03.933205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerStarted","Data":"242a8544a68d9d74ed6eb73bcacefdebb2fd4ae624de878f66d716d08691a8be"}
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.009110 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-86mtg"]
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.034455 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9b27\" (UniqueName: \"kubernetes.io/projected/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-kube-api-access-p9b27\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.034959 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-ovndb-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.035011 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.035055 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-public-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.035098 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-combined-ca-bundle\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.035132 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-internal-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.035245 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-httpd-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.040024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-httpd-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.042728 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-combined-ca-bundle\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.045283 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-ovndb-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.045871 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-public-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.049514 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-internal-tls-certs\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.053450 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-config\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.071029 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9b27\" (UniqueName: \"kubernetes.io/projected/a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b-kube-api-access-p9b27\") pod \"neutron-8694c7b8f7-2td8g\" (UID: \"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b\") " pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.159502 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.194084 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.357136 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 19:39:04 crc kubenswrapper[4722]: W0219 19:39:04.420777 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16a3e23c_d8b4_4030_ad8e_f12ffc069564.slice/crio-222656de90eea2fb87119d4565ce1d73241b41489eaee592945bf60298d04d35 WatchSource:0}: Error finding container 222656de90eea2fb87119d4565ce1d73241b41489eaee592945bf60298d04d35: Status 404 returned error can't find the container with id 222656de90eea2fb87119d4565ce1d73241b41489eaee592945bf60298d04d35
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.624202 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"]
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.807798 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tvbws"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.953723 4722 generic.go:334] "Generic (PLEG): container finished" podID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerID="6cecb6c27a5d8a3d6ffee2f1f0d633c671295bd59fc22535a5bf9eb9959995c0" exitCode=0
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.953764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerDied","Data":"6cecb6c27a5d8a3d6ffee2f1f0d633c671295bd59fc22535a5bf9eb9959995c0"}
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.956570 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerStarted","Data":"222656de90eea2fb87119d4565ce1d73241b41489eaee592945bf60298d04d35"}
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.959038 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-tvbws" event={"ID":"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5","Type":"ContainerDied","Data":"c3f9970d8e6df7b907c93856f4d4566a3aeee56e759df554fd7046ca7d3df35d"}
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.959075 4722 scope.go:117] "RemoveContainer" containerID="3ee66e2d9f3ff70ba95788fa2041bc3ab471615c0403f4144d9bc1ae897eb89c"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.959077 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-tvbws"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.963965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-86mtg" event={"ID":"1725704f-c153-4de4-9246-87c6a5e878ea","Type":"ContainerStarted","Data":"13413006ae1624571bd31498af1bfba16b06dc1ae973f9ef0d89f06ecc4ef187"}
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.964001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-86mtg" event={"ID":"1725704f-c153-4de4-9246-87c6a5e878ea","Type":"ContainerStarted","Data":"ea7852dcadbb3212d9207882980f204f6c637ee58504de45986bee8494bbea9e"}
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.967617 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerStarted","Data":"9f6bb518f1ba765c7a7052f429020ccd643361e2ca7e80330aa450dd36d72a26"}
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.967701 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59d6bc9fcb-2t849"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.967748 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59d6bc9fcb-2t849"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.975536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" event={"ID":"2e73e983-eb03-4734-838f-85a759275b7a","Type":"ContainerStarted","Data":"39341497cf9614456c8136bec5d4742d83abccc290b3636722c56ae71d3a4127"}
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.980477 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerStarted","Data":"7df30d9bfcde00a2b1fb449d6fdae155a98e793982f413e0d76e3453b4b0afd2"}
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.982811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxrcx\" (UniqueName: \"kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") "
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.982873 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") "
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.983289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") "
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.983318 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") "
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.984243 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") "
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.984268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb\") pod \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\" (UID: \"d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5\") "
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.984257 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-86mtg" podStartSLOduration=1.9842465379999998 podStartE2EDuration="1.984246538s" podCreationTimestamp="2026-02-19 19:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:04.980666476 +0000 UTC m=+1244.593016800" watchObservedRunningTime="2026-02-19 19:39:04.984246538 +0000 UTC m=+1244.596596852"
Feb 19 19:39:04 crc kubenswrapper[4722]: I0219 19:39:04.987207 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx" (OuterVolumeSpecName: "kube-api-access-gxrcx") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "kube-api-access-gxrcx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.023572 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-59d6bc9fcb-2t849" podStartSLOduration=3.023548681 podStartE2EDuration="3.023548681s" podCreationTimestamp="2026-02-19 19:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:05.004491498 +0000 UTC m=+1244.616841822" watchObservedRunningTime="2026-02-19 19:39:05.023548681 +0000 UTC m=+1244.635899015"
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.044383 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8694c7b8f7-2td8g"]
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.051849 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config" (OuterVolumeSpecName: "config") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.054548 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.056895 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.065529 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.068464 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" (UID: "d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086373 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086401 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086414 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxrcx\" (UniqueName: \"kubernetes.io/projected/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-kube-api-access-gxrcx\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086425 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086435 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.086444 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.430748 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"]
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.446876 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-tvbws"]
Feb 19 19:39:05 crc kubenswrapper[4722]: I0219 19:39:05.859859 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:05.999949 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e73e983-eb03-4734-838f-85a759275b7a" containerID="b58a92dec7aa9fa905d85fdb92866ee986e06e5d51cc5b90911bb9db7cccb1d3" exitCode=0
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.000048 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" event={"ID":"2e73e983-eb03-4734-838f-85a759275b7a","Type":"ContainerDied","Data":"b58a92dec7aa9fa905d85fdb92866ee986e06e5d51cc5b90911bb9db7cccb1d3"}
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.005846 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8694c7b8f7-2td8g" event={"ID":"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b","Type":"ContainerStarted","Data":"a0ab87ecf0804ea35d3ab983301b5f69fe1b71f605261c3ce9bc8489326f6346"}
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.017504 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") "
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.017890 4722 generic.go:334] "Generic (PLEG): container finished" podID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerID="e28ba51730232a08a3cd5dc96327f73be33823e4b79e43d0c66d0800f455e9e0" exitCode=0
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.018014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerDied","Data":"e28ba51730232a08a3cd5dc96327f73be33823e4b79e43d0c66d0800f455e9e0"}
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.018560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xsfs\" (UniqueName: \"kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") "
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.018629 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") "
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.018703 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") "
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.018776 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") "
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.019607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") "
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.019646 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml\") pod \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\" (UID: \"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16\") "
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.026662 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.028722 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.029980 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts" (OuterVolumeSpecName: "scripts") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.031295 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerStarted","Data":"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967"}
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.035611 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs" (OuterVolumeSpecName: "kube-api-access-6xsfs") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "kube-api-access-6xsfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.044327 4722 generic.go:334] "Generic (PLEG): container finished" podID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerID="891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351" exitCode=0
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.044609 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerDied","Data":"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351"}
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.044664 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc3f8aa2-4ca7-440c-9fb9-3707e404ce16","Type":"ContainerDied","Data":"1faa29ce27320ab22dc6db2828db88d540021f7a0832148de51b439f8684b1f0"}
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.044682 4722 scope.go:117] "RemoveContainer" containerID="81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.044860 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.072221 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.129241 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.129532 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.129545 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.129558 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xsfs\" (UniqueName: \"kubernetes.io/projected/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-kube-api-access-6xsfs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.129569 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.159941 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.172167 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data" (OuterVolumeSpecName: "config-data") pod "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" (UID: "dc3f8aa2-4ca7-440c-9fb9-3707e404ce16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.232128 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.232185 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.271934 4722 scope.go:117] "RemoveContainer" containerID="5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.285170 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d74fd689-q5qhb" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.331607 4722 scope.go:117] "RemoveContainer" containerID="891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.332820 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.332871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.332937 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.333600 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.333639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 
19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.333737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.333795 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2qkz\" (UniqueName: \"kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz\") pod \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\" (UID: \"5c88f138-094d-44c0-b1c9-1492e7e11e9b\") " Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.337634 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz" (OuterVolumeSpecName: "kube-api-access-c2qkz") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "kube-api-access-c2qkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.346433 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.436802 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2qkz\" (UniqueName: \"kubernetes.io/projected/5c88f138-094d-44c0-b1c9-1492e7e11e9b-kube-api-access-c2qkz\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.436846 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.456405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.463887 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.470491 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config" (OuterVolumeSpecName: "config") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.484415 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.505169 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c88f138-094d-44c0-b1c9-1492e7e11e9b" (UID: "5c88f138-094d-44c0-b1c9-1492e7e11e9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.540119 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.540221 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.540236 4722 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.540247 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.540260 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c88f138-094d-44c0-b1c9-1492e7e11e9b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.593300 4722 scope.go:117] "RemoveContainer" containerID="6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.615268 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.626576 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650246 4722 scope.go:117] "RemoveContainer" containerID="81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650277 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650670 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-notification-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650681 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-notification-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650698 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" containerName="init" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650705 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" containerName="init" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650724 4722 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-central-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650731 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-central-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650745 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650750 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650762 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-api" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650768 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-api" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650780 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="sg-core" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650786 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="sg-core" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.650794 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="proxy-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650800 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="proxy-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650968 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="proxy-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650979 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650988 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-api" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.650998 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-central-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.651012 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="sg-core" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.651024 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" containerName="init" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.651035 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" containerName="ceilometer-notification-agent" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.652687 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.653038 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c\": container with ID starting with 81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c not found: ID does not exist" containerID="81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.653069 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c"} err="failed to get container status \"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c\": rpc error: code = NotFound desc = could not find container \"81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c\": container with ID starting with 81ae974b88b7632b1ad927ae183d067374229dddc00dbbf6a182a8cc7471418c not found: ID does not exist" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.653090 4722 scope.go:117] "RemoveContainer" containerID="5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.654713 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.655978 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.657972 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd\": container with ID starting with 5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd not found: ID does not 
exist" containerID="5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.658026 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd"} err="failed to get container status \"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd\": rpc error: code = NotFound desc = could not find container \"5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd\": container with ID starting with 5a2991a9af479277078b05357d89d46963130ad6f0394b960ccb5f91d3e086dd not found: ID does not exist" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.658058 4722 scope.go:117] "RemoveContainer" containerID="891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.659589 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351\": container with ID starting with 891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351 not found: ID does not exist" containerID="891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.659622 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351"} err="failed to get container status \"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351\": rpc error: code = NotFound desc = could not find container \"891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351\": container with ID starting with 891fdd697c67b2e457dbf66de618190ddf2d6b116256379de8df8e8a03f2e351 not found: ID does not exist" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.659646 4722 scope.go:117] 
"RemoveContainer" containerID="6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14" Feb 19 19:39:06 crc kubenswrapper[4722]: E0219 19:39:06.659954 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14\": container with ID starting with 6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14 not found: ID does not exist" containerID="6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.659978 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14"} err="failed to get container status \"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14\": rpc error: code = NotFound desc = could not find container \"6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14\": container with ID starting with 6b45898a524bb955ce041f2db5099c312a6ad52b8fdc31a63d3a6ed6ae14fc14 not found: ID does not exist" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.663217 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.745566 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.745724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data\") pod \"ceilometer-0\" (UID: 
\"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.746945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.747007 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.747047 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.747144 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72xgb\" (UniqueName: \"kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.747367 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 
19:39:06.761162 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.848936 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.849005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.849034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.850383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.850431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0" Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.850470 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-72xgb\" (UniqueName: \"kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.850568 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.850915 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.851103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.854658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.855847 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.863923 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.864920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0"
Feb 19 19:39:06 crc kubenswrapper[4722]: I0219 19:39:06.871459 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72xgb\" (UniqueName: \"kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb\") pod \"ceilometer-0\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " pod="openstack/ceilometer-0"
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.057591 4722 generic.go:334] "Generic (PLEG): container finished" podID="1725704f-c153-4de4-9246-87c6a5e878ea" containerID="13413006ae1624571bd31498af1bfba16b06dc1ae973f9ef0d89f06ecc4ef187" exitCode=0
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.057654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-86mtg" event={"ID":"1725704f-c153-4de4-9246-87c6a5e878ea","Type":"ContainerDied","Data":"13413006ae1624571bd31498af1bfba16b06dc1ae973f9ef0d89f06ecc4ef187"}
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.063236 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" event={"ID":"96ffdf9d-f932-419b-be31-9f38358d2db5","Type":"ContainerStarted","Data":"c18c2ab41b489e7badb5ac98b2e3c4606d65918c65f00e12eb38be57c9fae474"}
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.083732 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d74fd689-q5qhb"
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.088841 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.119250 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5" path="/var/lib/kubelet/pods/d5ef6823-2e42-41cf-8eda-f9ea51c8c6f5/volumes"
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.119945 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc3f8aa2-4ca7-440c-9fb9-3707e404ce16" path="/var/lib/kubelet/pods/dc3f8aa2-4ca7-440c-9fb9-3707e404ce16/volumes"
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122557 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c"
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122587 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" event={"ID":"2e73e983-eb03-4734-838f-85a759275b7a","Type":"ContainerStarted","Data":"fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16"}
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122611 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" event={"ID":"66f5042d-2b30-4ac4-8594-cfc0f9590460","Type":"ContainerStarted","Data":"ed273f397b825fdd8bdb6f68de94a3ee0db9b04b68950081332c2fde978e7ba0"}
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122627 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8694c7b8f7-2td8g" event={"ID":"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b","Type":"ContainerStarted","Data":"a3bd7013e2ba7173aa28bb7c35ee99099b95a80e1d8988098d53c80782aa5146"}
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122639 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d74fd689-q5qhb" event={"ID":"5c88f138-094d-44c0-b1c9-1492e7e11e9b","Type":"ContainerDied","Data":"245b2a4bf08b03ca07fdc608528d3501f8e470227ac611d75e1e28818470fe64"}
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.122927 4722 scope.go:117] "RemoveContainer" containerID="6cecb6c27a5d8a3d6ffee2f1f0d633c671295bd59fc22535a5bf9eb9959995c0"
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.137449 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" podStartSLOduration=4.137431076 podStartE2EDuration="4.137431076s" podCreationTimestamp="2026-02-19 19:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:07.107944168 +0000 UTC m=+1246.720294492" watchObservedRunningTime="2026-02-19 19:39:07.137431076 +0000 UTC m=+1246.749781400"
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.149428 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d74fd689-q5qhb"]
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.174429 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d74fd689-q5qhb"]
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.175386 4722 scope.go:117] "RemoveContainer" containerID="e28ba51730232a08a3cd5dc96327f73be33823e4b79e43d0c66d0800f455e9e0"
Feb 19 19:39:07 crc kubenswrapper[4722]: I0219 19:39:07.680244 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.096418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" event={"ID":"96ffdf9d-f932-419b-be31-9f38358d2db5","Type":"ContainerStarted","Data":"562d790cafd82cb261abee24047ccdf78c04de4bc01c721b2bc25d10c0b503fa"}
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.097894 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerStarted","Data":"2fc5d288c8b590c8621fd130a7dd63655d59f6c92407b8882f0ffae525ddf63d"}
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.100391 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" event={"ID":"66f5042d-2b30-4ac4-8594-cfc0f9590460","Type":"ContainerStarted","Data":"7e892a24818bdb1b487baebbcd5ac410109432e5dcc2bf11f41b117cd4d8ca06"}
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.103094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8694c7b8f7-2td8g" event={"ID":"a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b","Type":"ContainerStarted","Data":"120a7a51f92ae71b94e908bdc8f0169f02eb2c6093c0a16f931af3f4da30a580"}
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.103303 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8694c7b8f7-2td8g"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.106364 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerStarted","Data":"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0"}
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.106495 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api" containerID="cri-o://f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0" gracePeriod=30
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.106513 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.106489 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api-log" containerID="cri-o://ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" gracePeriod=30
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.116511 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerStarted","Data":"9d72b6d3788d72322e0731586a982dd0b1c77afd8b08087faa403a0b4e5395dd"}
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.116575 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerStarted","Data":"923d05da4abe9a12313b3a4fc1af83169003ed9d770ecb195f6c8cd32223d17f"}
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.125787 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-98b54b474-9tfhf" podStartSLOduration=3.75351923 podStartE2EDuration="7.125770079s" podCreationTimestamp="2026-02-19 19:39:01 +0000 UTC" firstStartedPulling="2026-02-19 19:39:03.019216095 +0000 UTC m=+1242.631566419" lastFinishedPulling="2026-02-19 19:39:06.391466944 +0000 UTC m=+1246.003817268" observedRunningTime="2026-02-19 19:39:08.120387172 +0000 UTC m=+1247.732737516" watchObservedRunningTime="2026-02-19 19:39:08.125770079 +0000 UTC m=+1247.738120403"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.152704 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8694c7b8f7-2td8g" podStartSLOduration=5.152679786 podStartE2EDuration="5.152679786s" podCreationTimestamp="2026-02-19 19:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:08.145570495 +0000 UTC m=+1247.757920829" watchObservedRunningTime="2026-02-19 19:39:08.152679786 +0000 UTC m=+1247.765030110"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.170886 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.123168857 podStartE2EDuration="5.170868783s" podCreationTimestamp="2026-02-19 19:39:03 +0000 UTC" firstStartedPulling="2026-02-19 19:39:04.224439426 +0000 UTC m=+1243.836789750" lastFinishedPulling="2026-02-19 19:39:06.272139352 +0000 UTC m=+1245.884489676" observedRunningTime="2026-02-19 19:39:08.169011904 +0000 UTC m=+1247.781362238" watchObservedRunningTime="2026-02-19 19:39:08.170868783 +0000 UTC m=+1247.783219127"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.209046 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6767bd5ccf-ggbrg" podStartSLOduration=3.593083886 podStartE2EDuration="7.209027309s" podCreationTimestamp="2026-02-19 19:39:01 +0000 UTC" firstStartedPulling="2026-02-19 19:39:02.755675724 +0000 UTC m=+1242.368026048" lastFinishedPulling="2026-02-19 19:39:06.371619147 +0000 UTC m=+1245.983969471" observedRunningTime="2026-02-19 19:39:08.198552904 +0000 UTC m=+1247.810903238" watchObservedRunningTime="2026-02-19 19:39:08.209027309 +0000 UTC m=+1247.821377643"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.275258 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.275232029 podStartE2EDuration="5.275232029s" podCreationTimestamp="2026-02-19 19:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:08.245405331 +0000 UTC m=+1247.857755655" watchObservedRunningTime="2026-02-19 19:39:08.275232029 +0000 UTC m=+1247.887582363"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.510380 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.749117 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.789920 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-546c4d4684-6vk7j"]
Feb 19 19:39:08 crc kubenswrapper[4722]: E0219 19:39:08.790381 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1725704f-c153-4de4-9246-87c6a5e878ea" containerName="cloudkitty-storageinit"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.790394 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1725704f-c153-4de4-9246-87c6a5e878ea" containerName="cloudkitty-storageinit"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.790589 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1725704f-c153-4de4-9246-87c6a5e878ea" containerName="cloudkitty-storageinit"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.791643 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.795259 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.795411 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.832208 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-546c4d4684-6vk7j"]
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.856700 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.904197 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle\") pod \"1725704f-c153-4de4-9246-87c6a5e878ea\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") "
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.904270 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw9tj\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj\") pod \"1725704f-c153-4de4-9246-87c6a5e878ea\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") "
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.904289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts\") pod \"1725704f-c153-4de4-9246-87c6a5e878ea\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") "
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.905305 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data\") pod \"1725704f-c153-4de4-9246-87c6a5e878ea\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") "
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.905341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs\") pod \"1725704f-c153-4de4-9246-87c6a5e878ea\" (UID: \"1725704f-c153-4de4-9246-87c6a5e878ea\") "
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.905690 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.905997 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7701b23-dddb-4a45-8982-11ab69bc30b1-logs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.906068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-public-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.906183 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data-custom\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.906275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-internal-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.906497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptvm4\" (UniqueName: \"kubernetes.io/projected/a7701b23-dddb-4a45-8982-11ab69bc30b1-kube-api-access-ptvm4\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.906700 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-combined-ca-bundle\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.914280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj" (OuterVolumeSpecName: "kube-api-access-zw9tj") pod "1725704f-c153-4de4-9246-87c6a5e878ea" (UID: "1725704f-c153-4de4-9246-87c6a5e878ea"). InnerVolumeSpecName "kube-api-access-zw9tj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.918580 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts" (OuterVolumeSpecName: "scripts") pod "1725704f-c153-4de4-9246-87c6a5e878ea" (UID: "1725704f-c153-4de4-9246-87c6a5e878ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.919222 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs" (OuterVolumeSpecName: "certs") pod "1725704f-c153-4de4-9246-87c6a5e878ea" (UID: "1725704f-c153-4de4-9246-87c6a5e878ea"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.948767 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data" (OuterVolumeSpecName: "config-data") pod "1725704f-c153-4de4-9246-87c6a5e878ea" (UID: "1725704f-c153-4de4-9246-87c6a5e878ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:08 crc kubenswrapper[4722]: I0219 19:39:08.951523 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1725704f-c153-4de4-9246-87c6a5e878ea" (UID: "1725704f-c153-4de4-9246-87c6a5e878ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008567 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l94k\" (UniqueName: \"kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") "
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008613 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") "
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008683 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") "
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008728 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") "
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008790 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") "
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008810 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") "
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.008925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom\") pod \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\" (UID: \"16a3e23c-d8b4-4030-ad8e-f12ffc069564\") "
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009232 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009277 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7701b23-dddb-4a45-8982-11ab69bc30b1-logs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-public-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009374 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data-custom\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009397 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-internal-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009429 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvm4\" (UniqueName: \"kubernetes.io/projected/a7701b23-dddb-4a45-8982-11ab69bc30b1-kube-api-access-ptvm4\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009468 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-combined-ca-bundle\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009563 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009581 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009590 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009601 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw9tj\" (UniqueName: \"kubernetes.io/projected/1725704f-c153-4de4-9246-87c6a5e878ea-kube-api-access-zw9tj\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009609 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1725704f-c153-4de4-9246-87c6a5e878ea-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.009571 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.010208 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs" (OuterVolumeSpecName: "logs") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.015932 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7701b23-dddb-4a45-8982-11ab69bc30b1-logs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.019042 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k" (OuterVolumeSpecName: "kube-api-access-7l94k") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "kube-api-access-7l94k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.025913 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-public-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.025979 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-combined-ca-bundle\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.026007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data-custom\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.026019 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts" (OuterVolumeSpecName: "scripts") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.026619 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-internal-tls-certs\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.027719 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7701b23-dddb-4a45-8982-11ab69bc30b1-config-data\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.032963 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.040100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvm4\" (UniqueName: \"kubernetes.io/projected/a7701b23-dddb-4a45-8982-11ab69bc30b1-kube-api-access-ptvm4\") pod \"barbican-api-546c4d4684-6vk7j\" (UID: \"a7701b23-dddb-4a45-8982-11ab69bc30b1\") " pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.050388 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.082398 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data" (OuterVolumeSpecName: "config-data") pod "16a3e23c-d8b4-4030-ad8e-f12ffc069564" (UID: "16a3e23c-d8b4-4030-ad8e-f12ffc069564"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.084133 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" path="/var/lib/kubelet/pods/5c88f138-094d-44c0-b1c9-1492e7e11e9b/volumes"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112002 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16a3e23c-d8b4-4030-ad8e-f12ffc069564-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112032 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112042 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112052 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112062 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/16a3e23c-d8b4-4030-ad8e-f12ffc069564-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112070 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l94k\" (UniqueName: \"kubernetes.io/projected/16a3e23c-d8b4-4030-ad8e-f12ffc069564-kube-api-access-7l94k\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.112080 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/16a3e23c-d8b4-4030-ad8e-f12ffc069564-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.140962 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.152485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerStarted","Data":"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d"}
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161041 4722 generic.go:334] "Generic (PLEG): container finished" podID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerID="f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0" exitCode=0
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161073 4722 generic.go:334] "Generic (PLEG): container finished" podID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerID="ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" exitCode=143
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161197 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerDied","Data":"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0"}
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161226 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerDied","Data":"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967"}
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161236 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"16a3e23c-d8b4-4030-ad8e-f12ffc069564","Type":"ContainerDied","Data":"222656de90eea2fb87119d4565ce1d73241b41489eaee592945bf60298d04d35"}
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161257 4722 scope.go:117] "RemoveContainer" containerID="f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.161395 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.174978 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-86mtg"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.176115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-86mtg" event={"ID":"1725704f-c153-4de4-9246-87c6a5e878ea","Type":"ContainerDied","Data":"ea7852dcadbb3212d9207882980f204f6c637ee58504de45986bee8494bbea9e"}
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.176219 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea7852dcadbb3212d9207882980f204f6c637ee58504de45986bee8494bbea9e"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.230294 4722 scope.go:117] "RemoveContainer" containerID="ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967"
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.282466 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.312342 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.315323 4722 scope.go:117] "RemoveContainer" containerID="f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0"
Feb 19 19:39:09 crc kubenswrapper[4722]: E0219 19:39:09.316759 4722 log.go:32] "ContainerStatus from runtime service
failed" err="rpc error: code = NotFound desc = could not find container \"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0\": container with ID starting with f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0 not found: ID does not exist" containerID="f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.316799 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0"} err="failed to get container status \"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0\": rpc error: code = NotFound desc = could not find container \"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0\": container with ID starting with f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0 not found: ID does not exist" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.316822 4722 scope.go:117] "RemoveContainer" containerID="ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" Feb 19 19:39:09 crc kubenswrapper[4722]: E0219 19:39:09.317928 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967\": container with ID starting with ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967 not found: ID does not exist" containerID="ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.317971 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967"} err="failed to get container status \"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967\": rpc error: code = NotFound desc = could not find container 
\"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967\": container with ID starting with ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967 not found: ID does not exist" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.317997 4722 scope.go:117] "RemoveContainer" containerID="f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.318552 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0"} err="failed to get container status \"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0\": rpc error: code = NotFound desc = could not find container \"f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0\": container with ID starting with f45b7c4a1d36e2451daa588c5bc2b1a1cbb35219b1a9f433981a7259ed0fd4b0 not found: ID does not exist" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.318580 4722 scope.go:117] "RemoveContainer" containerID="ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.318938 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967"} err="failed to get container status \"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967\": rpc error: code = NotFound desc = could not find container \"ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967\": container with ID starting with ca78f91b1e0769c4f2204fa50b89d2a522f100ddfccdd102e7ae6f1adfce1967 not found: ID does not exist" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.355811 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: E0219 19:39:09.356222 4722 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.356236 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api" Feb 19 19:39:09 crc kubenswrapper[4722]: E0219 19:39:09.356256 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api-log" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.356263 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api-log" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.356439 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api-log" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.356468 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" containerName="cinder-api" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.357749 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.368402 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.368664 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.368864 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.385112 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.421481 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.422666 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.440556 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.451751 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.452107 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-bnkq4" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.452311 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.452485 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.452642 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cloudkitty-proc-config-data" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.461389 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.461623 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="dnsmasq-dns" containerID="cri-o://fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16" gracePeriod=10 Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523885 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts\") pod \"cloudkitty-proc-0\" 
(UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-scripts\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.523964 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524018 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8e6512-8007-4e99-8589-8dccb1975e3f-logs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524043 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524091 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4frh\" (UniqueName: \"kubernetes.io/projected/8c8e6512-8007-4e99-8589-8dccb1975e3f-kube-api-access-p4frh\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524135 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8e6512-8007-4e99-8589-8dccb1975e3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524168 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26ng\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.524219 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.606136 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.607964 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.637868 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-scripts\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638141 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638176 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638193 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data\") pod \"cloudkitty-proc-0\" (UID: 
\"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638220 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638239 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638273 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8e6512-8007-4e99-8589-8dccb1975e3f-logs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638290 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kgkx\" (UniqueName: \"kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638307 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " 
pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638347 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638364 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4frh\" (UniqueName: \"kubernetes.io/projected/8c8e6512-8007-4e99-8589-8dccb1975e3f-kube-api-access-p4frh\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638418 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8e6512-8007-4e99-8589-8dccb1975e3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638447 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p26ng\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638559 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638595 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts\") 
pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.638609 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.639821 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8c8e6512-8007-4e99-8589-8dccb1975e3f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.640513 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8e6512-8007-4e99-8589-8dccb1975e3f-logs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.654800 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.656787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data-custom\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.661813 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.665454 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.669703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.671919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.677106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.677564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.682898 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p4frh\" (UniqueName: \"kubernetes.io/projected/8c8e6512-8007-4e99-8589-8dccb1975e3f-kube-api-access-p4frh\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.696861 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-config-data\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.710772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26ng\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.735837 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740329 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: 
\"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740433 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740475 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kgkx\" (UniqueName: \"kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.740548 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.741613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " 
pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.742123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.742678 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.754379 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.754782 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.754934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8e6512-8007-4e99-8589-8dccb1975e3f-scripts\") pod \"cinder-api-0\" (UID: \"8c8e6512-8007-4e99-8589-8dccb1975e3f\") " pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.780780 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.798470 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.799036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kgkx\" (UniqueName: \"kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx\") pod \"dnsmasq-dns-67bdc55879-2g6g8\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") " pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.800040 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.832193 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.835884 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.837648 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.889017 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946129 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946297 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946432 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbhb\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946570 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: 
I0219 19:39:09.946898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.946990 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:09 crc kubenswrapper[4722]: E0219 19:39:09.984048 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e73e983_eb03_4734_838f_85a759275b7a.slice/crio-fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16.scope\": RecentStats: unable to find data in memory cache]" Feb 19 19:39:09 crc kubenswrapper[4722]: I0219 19:39:09.988298 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051236 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbhb\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051337 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051369 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051471 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.051605 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.052296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.061013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.061432 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.062872 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc 
kubenswrapper[4722]: I0219 19:39:10.079748 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.079954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.088705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbhb\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb\") pod \"cloudkitty-api-0\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.256928 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-546c4d4684-6vk7j"] Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.279365 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e73e983-eb03-4734-838f-85a759275b7a" containerID="fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16" exitCode=0 Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.279399 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" event={"ID":"2e73e983-eb03-4734-838f-85a759275b7a","Type":"ContainerDied","Data":"fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16"} Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.285134 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerStarted","Data":"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5"} Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.327204 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.693384 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.784949 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.785603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zffz\" (UniqueName: \"kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.785811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.785935 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 
19:39:10.786037 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.786822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc\") pod \"2e73e983-eb03-4734-838f-85a759275b7a\" (UID: \"2e73e983-eb03-4734-838f-85a759275b7a\") " Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.800494 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz" (OuterVolumeSpecName: "kube-api-access-4zffz") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "kube-api-access-4zffz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.890663 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zffz\" (UniqueName: \"kubernetes.io/projected/2e73e983-eb03-4734-838f-85a759275b7a-kube-api-access-4zffz\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.902724 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 19:39:10 crc kubenswrapper[4722]: W0219 19:39:10.921641 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8676c8db_d85f_44d2_ae94_560542a5cbf3.slice/crio-cf89ffbf474dc1e5f2a9ec1d323956a0406b1e6ce7a0ddc5729131b991819812 WatchSource:0}: Error finding container cf89ffbf474dc1e5f2a9ec1d323956a0406b1e6ce7a0ddc5729131b991819812: Status 404 returned error can't find the container with id cf89ffbf474dc1e5f2a9ec1d323956a0406b1e6ce7a0ddc5729131b991819812 Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.937657 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.961598 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.968640 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"] Feb 19 19:39:10 crc kubenswrapper[4722]: I0219 19:39:10.992308 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.011684 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.016181 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.021886 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config" (OuterVolumeSpecName: "config") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.023616 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e73e983-eb03-4734-838f-85a759275b7a" (UID: "2e73e983-eb03-4734-838f-85a759275b7a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.093661 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.093685 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.093700 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.093708 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e73e983-eb03-4734-838f-85a759275b7a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.116522 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a3e23c-d8b4-4030-ad8e-f12ffc069564" path="/var/lib/kubelet/pods/16a3e23c-d8b4-4030-ad8e-f12ffc069564/volumes" Feb 19 19:39:11 crc kubenswrapper[4722]: W0219 19:39:11.182292 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7d55206_1b8d_4013_a42b_d7e634815929.slice/crio-0f03ba3ad2b59fa11ba2712209a1fe901bae7bd74fce14edbb9fe896e87c9bc2 WatchSource:0}: Error finding container 0f03ba3ad2b59fa11ba2712209a1fe901bae7bd74fce14edbb9fe896e87c9bc2: Status 404 returned error can't find the container with id 0f03ba3ad2b59fa11ba2712209a1fe901bae7bd74fce14edbb9fe896e87c9bc2 Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.230945 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.307011 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" event={"ID":"2e73e983-eb03-4734-838f-85a759275b7a","Type":"ContainerDied","Data":"39341497cf9614456c8136bec5d4742d83abccc290b3636722c56ae71d3a4127"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.307059 4722 scope.go:117] "RemoveContainer" containerID="fa879b1e3253fc820960a3e8dedf144b6c552a5104bf1f7847f60ac303f8ee16" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.307215 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-7hq9c" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.315882 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerStarted","Data":"0f03ba3ad2b59fa11ba2712209a1fe901bae7bd74fce14edbb9fe896e87c9bc2"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.319711 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c8e6512-8007-4e99-8589-8dccb1975e3f","Type":"ContainerStarted","Data":"3379a200763a43f81a93e7484422905bc009b41cd937c2876aa4df4396fe51aa"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.351990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerStarted","Data":"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.360408 4722 generic.go:334] "Generic (PLEG): container finished" podID="8f530e65-8397-49d6-929a-201bb5dfe585" containerID="380c536ebfd3cf4e5ded9eb26bb64cd838a985f8d5ba0c199a97d05a07b511f3" exitCode=0 Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.360489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" event={"ID":"8f530e65-8397-49d6-929a-201bb5dfe585","Type":"ContainerDied","Data":"380c536ebfd3cf4e5ded9eb26bb64cd838a985f8d5ba0c199a97d05a07b511f3"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.360514 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" event={"ID":"8f530e65-8397-49d6-929a-201bb5dfe585","Type":"ContainerStarted","Data":"6a529cc3a96af23463f3dfa462bf02cb46f29fb8e36534fccb322ef7ab7a6728"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.381718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-546c4d4684-6vk7j" event={"ID":"a7701b23-dddb-4a45-8982-11ab69bc30b1","Type":"ContainerStarted","Data":"61aaad90ae962a83d80291b7e325d625dae8793c7d5caf63aeaf7adc9417ebd3"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.381776 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-546c4d4684-6vk7j" event={"ID":"a7701b23-dddb-4a45-8982-11ab69bc30b1","Type":"ContainerStarted","Data":"b5f7b05354b2ad7eb50e5be20f018c5ba3940cc461bcda1e9b151c7a789fd61d"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.381790 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-546c4d4684-6vk7j" event={"ID":"a7701b23-dddb-4a45-8982-11ab69bc30b1","Type":"ContainerStarted","Data":"50659fb6df6ab070768ddc57a8fa622cf5a2601aadff9c10ef50cb62fbc11144"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.382757 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.382789 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-546c4d4684-6vk7j" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.411279 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8676c8db-d85f-44d2-ae94-560542a5cbf3","Type":"ContainerStarted","Data":"cf89ffbf474dc1e5f2a9ec1d323956a0406b1e6ce7a0ddc5729131b991819812"} Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.459997 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-546c4d4684-6vk7j" podStartSLOduration=3.459981016 podStartE2EDuration="3.459981016s" podCreationTimestamp="2026-02-19 19:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:11.431431567 +0000 UTC m=+1251.043781901" 
watchObservedRunningTime="2026-02-19 19:39:11.459981016 +0000 UTC m=+1251.072331340" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.596307 4722 scope.go:117] "RemoveContainer" containerID="b58a92dec7aa9fa905d85fdb92866ee986e06e5d51cc5b90911bb9db7cccb1d3" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.629559 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"] Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.650950 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-7hq9c"] Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.803700 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.803748 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.803785 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:39:11 crc kubenswrapper[4722]: I0219 19:39:11.804556 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:39:11 crc 
kubenswrapper[4722]: I0219 19:39:11.804603 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5" gracePeriod=600 Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.389237 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.459400 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c8e6512-8007-4e99-8589-8dccb1975e3f","Type":"ContainerStarted","Data":"231b9df70e244176ab3c47cc1a307eb91b71a9d5a2d108d53dcbbf64a3791510"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.471246 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5" exitCode=0 Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.471300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.471325 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.471340 4722 scope.go:117] "RemoveContainer" containerID="d8ceb58059028fac39dbad274e30d4a3cfc17b7b996b2c7fee64b6d0dd4a36f1" Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.476412 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" event={"ID":"8f530e65-8397-49d6-929a-201bb5dfe585","Type":"ContainerStarted","Data":"b0f785695269b6ae9fc48dfba62c1a732aa42aadccca7a02f2d798ea3429fbac"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.476514 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.525619 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerStarted","Data":"ece5fa600d0e9bf914182b73d6050f61408fd3ed43685cff4305bd053c11121a"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.525654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerStarted","Data":"72589df96ce77377988fc6fdfb157f998e009b737cb3d3fbc7c7fc30136cf951"} Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.525668 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.545987 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" podStartSLOduration=3.5459657570000003 podStartE2EDuration="3.545965757s" podCreationTimestamp="2026-02-19 19:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:12.540107925 +0000 UTC m=+1252.152458259" watchObservedRunningTime="2026-02-19 19:39:12.545965757 +0000 UTC m=+1252.158316081" Feb 19 19:39:12 crc kubenswrapper[4722]: I0219 19:39:12.571259 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.571238553 
podStartE2EDuration="3.571238553s" podCreationTimestamp="2026-02-19 19:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:12.564449332 +0000 UTC m=+1252.176799656" watchObservedRunningTime="2026-02-19 19:39:12.571238553 +0000 UTC m=+1252.183588877" Feb 19 19:39:13 crc kubenswrapper[4722]: I0219 19:39:13.087089 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e73e983-eb03-4734-838f-85a759275b7a" path="/var/lib/kubelet/pods/2e73e983-eb03-4734-838f-85a759275b7a/volumes" Feb 19 19:39:13 crc kubenswrapper[4722]: I0219 19:39:13.545033 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api-log" containerID="cri-o://72589df96ce77377988fc6fdfb157f998e009b737cb3d3fbc7c7fc30136cf951" gracePeriod=30 Feb 19 19:39:13 crc kubenswrapper[4722]: I0219 19:39:13.545392 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api" containerID="cri-o://ece5fa600d0e9bf914182b73d6050f61408fd3ed43685cff4305bd053c11121a" gracePeriod=30 Feb 19 19:39:13 crc kubenswrapper[4722]: I0219 19:39:13.796483 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 19:39:13 crc kubenswrapper[4722]: I0219 19:39:13.834723 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.501687 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.561853 4722 generic.go:334] "Generic (PLEG): container finished" podID="e7d55206-1b8d-4013-a42b-d7e634815929" 
containerID="ece5fa600d0e9bf914182b73d6050f61408fd3ed43685cff4305bd053c11121a" exitCode=0 Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.561880 4722 generic.go:334] "Generic (PLEG): container finished" podID="e7d55206-1b8d-4013-a42b-d7e634815929" containerID="72589df96ce77377988fc6fdfb157f998e009b737cb3d3fbc7c7fc30136cf951" exitCode=143 Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.561943 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerDied","Data":"ece5fa600d0e9bf914182b73d6050f61408fd3ed43685cff4305bd053c11121a"} Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.561990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerDied","Data":"72589df96ce77377988fc6fdfb157f998e009b737cb3d3fbc7c7fc30136cf951"} Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.563789 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8c8e6512-8007-4e99-8589-8dccb1975e3f","Type":"ContainerStarted","Data":"0c9fd2d9bee615421145ea3e792710d0810ce10840062bfbc82e03a581b134a9"} Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.563927 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="cinder-scheduler" containerID="cri-o://923d05da4abe9a12313b3a4fc1af83169003ed9d770ecb195f6c8cd32223d17f" gracePeriod=30 Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.563978 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="probe" containerID="cri-o://9d72b6d3788d72322e0731586a982dd0b1c77afd8b08087faa403a0b4e5395dd" gracePeriod=30 Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 
19:39:14.599905 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.599887296 podStartE2EDuration="5.599887296s" podCreationTimestamp="2026-02-19 19:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:14.585311712 +0000 UTC m=+1254.197662036" watchObservedRunningTime="2026-02-19 19:39:14.599887296 +0000 UTC m=+1254.212237620" Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.841261 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 19:39:14 crc kubenswrapper[4722]: I0219 19:39:14.923678 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59d6bc9fcb-2t849" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.029093 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtbhb\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121223 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121249 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs\") pod 
\"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121410 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.121513 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts\") pod \"e7d55206-1b8d-4013-a42b-d7e634815929\" (UID: \"e7d55206-1b8d-4013-a42b-d7e634815929\") " Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.127619 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs" (OuterVolumeSpecName: "logs") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.155373 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.156944 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb" (OuterVolumeSpecName: "kube-api-access-jtbhb") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "kube-api-access-jtbhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.164273 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts" (OuterVolumeSpecName: "scripts") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.165280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs" (OuterVolumeSpecName: "certs") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.199403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.209699 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data" (OuterVolumeSpecName: "config-data") pod "e7d55206-1b8d-4013-a42b-d7e634815929" (UID: "e7d55206-1b8d-4013-a42b-d7e634815929"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224074 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224115 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224125 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224134 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtbhb\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-kube-api-access-jtbhb\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224145 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7d55206-1b8d-4013-a42b-d7e634815929-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224212 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e7d55206-1b8d-4013-a42b-d7e634815929-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.224220 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d55206-1b8d-4013-a42b-d7e634815929-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.573054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8676c8db-d85f-44d2-ae94-560542a5cbf3","Type":"ContainerStarted","Data":"d2c2e25d0c7f308559a32682d3e82f4361b90df8632a2f98638b5645ce35f471"} Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.574999 4722 generic.go:334] "Generic (PLEG): container finished" podID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerID="9d72b6d3788d72322e0731586a982dd0b1c77afd8b08087faa403a0b4e5395dd" exitCode=0 Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.575018 4722 generic.go:334] "Generic (PLEG): container finished" podID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerID="923d05da4abe9a12313b3a4fc1af83169003ed9d770ecb195f6c8cd32223d17f" exitCode=0 Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.575060 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerDied","Data":"9d72b6d3788d72322e0731586a982dd0b1c77afd8b08087faa403a0b4e5395dd"} Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.575078 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerDied","Data":"923d05da4abe9a12313b3a4fc1af83169003ed9d770ecb195f6c8cd32223d17f"} Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.576735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"e7d55206-1b8d-4013-a42b-d7e634815929","Type":"ContainerDied","Data":"0f03ba3ad2b59fa11ba2712209a1fe901bae7bd74fce14edbb9fe896e87c9bc2"} Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.576793 4722 scope.go:117] "RemoveContainer" containerID="ece5fa600d0e9bf914182b73d6050f61408fd3ed43685cff4305bd053c11121a" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.576746 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.589392 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerStarted","Data":"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df"} Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.599525 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.874803734 podStartE2EDuration="6.5994976s" podCreationTimestamp="2026-02-19 19:39:09 +0000 UTC" firstStartedPulling="2026-02-19 19:39:10.927807627 +0000 UTC m=+1250.540157951" lastFinishedPulling="2026-02-19 19:39:14.652501493 +0000 UTC m=+1254.264851817" observedRunningTime="2026-02-19 19:39:15.595655191 +0000 UTC m=+1255.208005515" watchObservedRunningTime="2026-02-19 19:39:15.5994976 +0000 UTC m=+1255.211847924" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.647228 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.708325 4722 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.747559 4722 scope.go:117] "RemoveContainer" containerID="72589df96ce77377988fc6fdfb157f998e009b737cb3d3fbc7c7fc30136cf951" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.769670 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.796227 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:15 crc kubenswrapper[4722]: E0219 19:39:15.796729 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="dnsmasq-dns" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.796743 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="dnsmasq-dns" Feb 19 19:39:15 crc kubenswrapper[4722]: E0219 19:39:15.796758 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="init" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.796765 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="init" Feb 19 19:39:15 crc kubenswrapper[4722]: E0219 19:39:15.796781 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api-log" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.796788 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api-log" Feb 19 19:39:15 crc kubenswrapper[4722]: E0219 19:39:15.796795 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.796801 
4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.797034 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api-log" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.797056 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" containerName="cloudkitty-api" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.797066 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e73e983-eb03-4734-838f-85a759275b7a" containerName="dnsmasq-dns" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.798181 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.812078 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.862505704 podStartE2EDuration="9.812056854s" podCreationTimestamp="2026-02-19 19:39:06 +0000 UTC" firstStartedPulling="2026-02-19 19:39:07.700376993 +0000 UTC m=+1247.312727317" lastFinishedPulling="2026-02-19 19:39:14.649928143 +0000 UTC m=+1254.262278467" observedRunningTime="2026-02-19 19:39:15.708606815 +0000 UTC m=+1255.320957139" watchObservedRunningTime="2026-02-19 19:39:15.812056854 +0000 UTC m=+1255.424407178" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.816115 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.816267 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.816372 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cert-cloudkitty-internal-svc" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.852688 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.944940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945019 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945063 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945088 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945143 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945264 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br27h\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:15 crc kubenswrapper[4722]: I0219 19:39:15.945306 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046723 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs\") pod \"cloudkitty-api-0\" (UID: 
\"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046820 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046869 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046942 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.046999 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.047070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.047127 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br27h\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.047434 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.054762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.054868 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " 
pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.062025 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.062313 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.062611 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.064430 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.065919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.066458 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br27h\" (UniqueName: 
\"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h\") pod \"cloudkitty-api-0\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.162138 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.170279 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.251980 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.252066 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq4kd\" (UniqueName: \"kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.252120 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.252359 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: 
I0219 19:39:16.252461 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.252531 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom\") pod \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\" (UID: \"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d\") " Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.256142 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.260424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.261305 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts" (OuterVolumeSpecName: "scripts") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.266910 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd" (OuterVolumeSpecName: "kube-api-access-xq4kd") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "kube-api-access-xq4kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.347924 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.355432 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.355469 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq4kd\" (UniqueName: \"kubernetes.io/projected/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-kube-api-access-xq4kd\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.355481 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.355493 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.355502 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.402636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data" (OuterVolumeSpecName: "config-data") pod "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" (UID: "8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.457820 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.600087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d","Type":"ContainerDied","Data":"7df30d9bfcde00a2b1fb449d6fdae155a98e793982f413e0d76e3453b4b0afd2"} Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.600100 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.600462 4722 scope.go:117] "RemoveContainer" containerID="9d72b6d3788d72322e0731586a982dd0b1c77afd8b08087faa403a0b4e5395dd" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.601698 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.635265 4722 scope.go:117] "RemoveContainer" containerID="923d05da4abe9a12313b3a4fc1af83169003ed9d770ecb195f6c8cd32223d17f" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.636129 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.652297 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.664382 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:16 crc kubenswrapper[4722]: E0219 19:39:16.664910 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="cinder-scheduler" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.664933 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="cinder-scheduler" Feb 19 19:39:16 crc kubenswrapper[4722]: E0219 19:39:16.664963 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="probe" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.664972 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="probe" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.665235 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" 
containerName="probe" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.665257 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" containerName="cinder-scheduler" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.666587 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.670590 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.678216 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.756215 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:39:16 crc kubenswrapper[4722]: W0219 19:39:16.761354 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57386acb_6299_4fd3_80a2_25d8769dcc93.slice/crio-42972e824b9c0da7ea1f6a0d1a02b3318f196426677f7d390159e0bf2aae2802 WatchSource:0}: Error finding container 42972e824b9c0da7ea1f6a0d1a02b3318f196426677f7d390159e0bf2aae2802: Status 404 returned error can't find the container with id 42972e824b9c0da7ea1f6a0d1a02b3318f196426677f7d390159e0bf2aae2802 Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.765576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rszkk\" (UniqueName: \"kubernetes.io/projected/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-kube-api-access-rszkk\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.765923 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.765988 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.767046 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-scripts\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.767083 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.767103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.868722 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-scripts\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.868781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.868817 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.868882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rszkk\" (UniqueName: \"kubernetes.io/projected/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-kube-api-access-rszkk\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.868982 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.869029 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.869205 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.874339 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.875245 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.875367 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-scripts\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.889666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:16 crc kubenswrapper[4722]: I0219 19:39:16.897729 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-rszkk\" (UniqueName: \"kubernetes.io/projected/afcc30d0-b94c-4bf7-8736-fb35bc461fa2-kube-api-access-rszkk\") pod \"cinder-scheduler-0\" (UID: \"afcc30d0-b94c-4bf7-8736-fb35bc461fa2\") " pod="openstack/cinder-scheduler-0" Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.053008 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.117192 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d" path="/var/lib/kubelet/pods/8ddbcecf-f0d2-46f1-b959-e5fe1d163b4d/volumes" Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.118044 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d55206-1b8d-4013-a42b-d7e634815929" path="/var/lib/kubelet/pods/e7d55206-1b8d-4013-a42b-d7e634815929/volumes" Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.628326 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerStarted","Data":"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971"} Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.628793 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerStarted","Data":"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422"} Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.628821 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerStarted","Data":"42972e824b9c0da7ea1f6a0d1a02b3318f196426677f7d390159e0bf2aae2802"} Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.628857 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 
19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.628963 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="8676c8db-d85f-44d2-ae94-560542a5cbf3" containerName="cloudkitty-proc" containerID="cri-o://d2c2e25d0c7f308559a32682d3e82f4361b90df8632a2f98638b5645ce35f471" gracePeriod=30 Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.660475 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.660460258 podStartE2EDuration="2.660460258s" podCreationTimestamp="2026-02-19 19:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:17.657475256 +0000 UTC m=+1257.269825580" watchObservedRunningTime="2026-02-19 19:39:17.660460258 +0000 UTC m=+1257.272810582" Feb 19 19:39:17 crc kubenswrapper[4722]: I0219 19:39:17.798507 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 19:39:18 crc kubenswrapper[4722]: I0219 19:39:18.646785 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"afcc30d0-b94c-4bf7-8736-fb35bc461fa2","Type":"ContainerStarted","Data":"d1764b20f34ed6d85d0d47b8b2899873a0c19b1fb3f82cf9b3b2d74b7a687bc6"} Feb 19 19:39:18 crc kubenswrapper[4722]: I0219 19:39:18.647274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"afcc30d0-b94c-4bf7-8736-fb35bc461fa2","Type":"ContainerStarted","Data":"caafa2143ff64269289e010c971ce0519a5814d460892533e177e0083afeaebe"} Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.668587 4722 generic.go:334] "Generic (PLEG): container finished" podID="8676c8db-d85f-44d2-ae94-560542a5cbf3" containerID="d2c2e25d0c7f308559a32682d3e82f4361b90df8632a2f98638b5645ce35f471" exitCode=0 Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.668956 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8676c8db-d85f-44d2-ae94-560542a5cbf3","Type":"ContainerDied","Data":"d2c2e25d0c7f308559a32682d3e82f4361b90df8632a2f98638b5645ce35f471"} Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.684326 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"afcc30d0-b94c-4bf7-8736-fb35bc461fa2","Type":"ContainerStarted","Data":"b3b3e2358eab6f69738470fbb9c2f2d49b4d091b93d31cbbe4fde26000e442b3"} Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.727930 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.727905399 podStartE2EDuration="3.727905399s" podCreationTimestamp="2026-02-19 19:39:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:19.719228689 +0000 UTC m=+1259.331579053" watchObservedRunningTime="2026-02-19 19:39:19.727905399 +0000 UTC m=+1259.340255723" Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.975655 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:39:19 crc kubenswrapper[4722]: I0219 19:39:19.990311 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.076361 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"] Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.076673 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="dnsmasq-dns" containerID="cri-o://0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06" gracePeriod=10 Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101295 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101344 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101555 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101601 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101659 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p26ng\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.101761 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs\") pod \"8676c8db-d85f-44d2-ae94-560542a5cbf3\" (UID: \"8676c8db-d85f-44d2-ae94-560542a5cbf3\") " Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.119334 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng" (OuterVolumeSpecName: "kube-api-access-p26ng") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "kube-api-access-p26ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.124691 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs" (OuterVolumeSpecName: "certs") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.125325 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.134315 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts" (OuterVolumeSpecName: "scripts") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.197255 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.203970 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.204000 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.204009 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p26ng\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-kube-api-access-p26ng\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.204018 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8676c8db-d85f-44d2-ae94-560542a5cbf3-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.204027 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.216815 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data" (OuterVolumeSpecName: "config-data") pod "8676c8db-d85f-44d2-ae94-560542a5cbf3" (UID: "8676c8db-d85f-44d2-ae94-560542a5cbf3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.308135 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8676c8db-d85f-44d2-ae94-560542a5cbf3-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:20 crc kubenswrapper[4722]: E0219 19:39:20.563724 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf618be57_2b9f_4455_8de0_90379bc9d57b.slice/crio-0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06.scope\": RecentStats: unable to find data in memory cache]" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.696469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"8676c8db-d85f-44d2-ae94-560542a5cbf3","Type":"ContainerDied","Data":"cf89ffbf474dc1e5f2a9ec1d323956a0406b1e6ce7a0ddc5729131b991819812"} Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.696558 4722 scope.go:117] "RemoveContainer" containerID="d2c2e25d0c7f308559a32682d3e82f4361b90df8632a2f98638b5645ce35f471" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.697787 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.701136 4722 generic.go:334] "Generic (PLEG): container finished" podID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerID="0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06" exitCode=0 Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.701205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" event={"ID":"f618be57-2b9f-4455-8de0-90379bc9d57b","Type":"ContainerDied","Data":"0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06"} Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.701243 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" event={"ID":"f618be57-2b9f-4455-8de0-90379bc9d57b","Type":"ContainerDied","Data":"b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea"} Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.701260 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b45faf8bb73a0e07ec3500177daa08ffabc04115f5244bdef2acc1c1f815aaea" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.760694 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.779938 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.815779 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.826901 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:39:20 crc kubenswrapper[4722]: E0219 19:39:20.827421 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8676c8db-d85f-44d2-ae94-560542a5cbf3" containerName="cloudkitty-proc" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.827443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8676c8db-d85f-44d2-ae94-560542a5cbf3" containerName="cloudkitty-proc" Feb 19 19:39:20 crc kubenswrapper[4722]: E0219 19:39:20.827467 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="dnsmasq-dns" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.827476 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="dnsmasq-dns" Feb 19 19:39:20 crc kubenswrapper[4722]: E0219 19:39:20.827494 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="init" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.827501 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="init" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.827788 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" containerName="dnsmasq-dns" Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.827812 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="8676c8db-d85f-44d2-ae94-560542a5cbf3" containerName="cloudkitty-proc"
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.828652 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.830784 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") "
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839499 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsg42\" (UniqueName: \"kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") "
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839525 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") "
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") "
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") "
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.839693 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0\") pod \"f618be57-2b9f-4455-8de0-90379bc9d57b\" (UID: \"f618be57-2b9f-4455-8de0-90379bc9d57b\") "
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.847463 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42" (OuterVolumeSpecName: "kube-api-access-jsg42") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "kube-api-access-jsg42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.860891 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.904238 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.919047 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.923463 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.934476 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.941404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.941646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw245\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.941856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942082 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942280 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942421 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsg42\" (UniqueName: \"kubernetes.io/projected/f618be57-2b9f-4455-8de0-90379bc9d57b-kube-api-access-jsg42\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942495 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942558 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942614 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.942669 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:20 crc kubenswrapper[4722]: I0219 19:39:20.945362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config" (OuterVolumeSpecName: "config") pod "f618be57-2b9f-4455-8de0-90379bc9d57b" (UID: "f618be57-2b9f-4455-8de0-90379bc9d57b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045238 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045380 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045698 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045789 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw245\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.045942 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f618be57-2b9f-4455-8de0-90379bc9d57b-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.048802 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.049385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.049741 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.054186 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.059613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.060587 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.065625 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw245\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.065828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.093648 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8676c8db-d85f-44d2-ae94-560542a5cbf3" path="/var/lib/kubelet/pods/8676c8db-d85f-44d2-ae94-560542a5cbf3/volumes"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.172764 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.311944 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-546c4d4684-6vk7j"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.389798 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"]
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.390012 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59d6bc9fcb-2t849" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api-log" containerID="cri-o://deec74702da9c72730adb9e092792817648059be9727bcc7760a0aa5c428553c" gracePeriod=30
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.390442 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59d6bc9fcb-2t849" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api" containerID="cri-o://9f6bb518f1ba765c7a7052f429020ccd643361e2ca7e80330aa450dd36d72a26" gracePeriod=30
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.681278 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 19 19:39:21 crc kubenswrapper[4722]: W0219 19:39:21.683667 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00bbae7e_ebc6_4102_9398_fc131546bbf5.slice/crio-fc84223d282573fb3fb01a61be1e1be06c5fe3404b335fffa4163163f1c67edb WatchSource:0}: Error finding container fc84223d282573fb3fb01a61be1e1be06c5fe3404b335fffa4163163f1c67edb: Status 404 returned error can't find the container with id fc84223d282573fb3fb01a61be1e1be06c5fe3404b335fffa4163163f1c67edb
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.720414 4722 generic.go:334] "Generic (PLEG): container finished" podID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerID="deec74702da9c72730adb9e092792817648059be9727bcc7760a0aa5c428553c" exitCode=143
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.720499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerDied","Data":"deec74702da9c72730adb9e092792817648059be9727bcc7760a0aa5c428553c"}
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.725797 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-hcfgw"
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.726985 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"00bbae7e-ebc6-4102-9398-fc131546bbf5","Type":"ContainerStarted","Data":"fc84223d282573fb3fb01a61be1e1be06c5fe3404b335fffa4163163f1c67edb"}
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.790217 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"]
Feb 19 19:39:21 crc kubenswrapper[4722]: I0219 19:39:21.802951 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-hcfgw"]
Feb 19 19:39:22 crc kubenswrapper[4722]: I0219 19:39:22.054550 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 19 19:39:22 crc kubenswrapper[4722]: I0219 19:39:22.595405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 19 19:39:22 crc kubenswrapper[4722]: I0219 19:39:22.751334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"00bbae7e-ebc6-4102-9398-fc131546bbf5","Type":"ContainerStarted","Data":"20d63437963fbb92aa14a89d0ac3100abcdfca03a493c9976283dbcbad9c2d7e"}
Feb 19 19:39:22 crc kubenswrapper[4722]: I0219 19:39:22.785673 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.7856581030000003 podStartE2EDuration="2.785658103s" podCreationTimestamp="2026-02-19 19:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:22.773808075 +0000 UTC m=+1262.386158399" watchObservedRunningTime="2026-02-19 19:39:22.785658103 +0000 UTC m=+1262.398008427"
Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.083744 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f618be57-2b9f-4455-8de0-90379bc9d57b" path="/var/lib/kubelet/pods/f618be57-2b9f-4455-8de0-90379bc9d57b/volumes"
Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.340420 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.462036 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.481547 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.857540 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cc7c8879d-tnbfs"
Feb 19 19:39:23 crc kubenswrapper[4722]: I0219 19:39:23.942126 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7cc6894556-2r5j6"]
Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.388377 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7cb5f76f4-hx5jh"
Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.599885 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59d6bc9fcb-2t849" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": read tcp 10.217.0.2:48414->10.217.0.178:9311: read: connection reset by peer"
Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.599899 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59d6bc9fcb-2t849" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": read tcp 10.217.0.2:48400->10.217.0.178:9311: read: connection reset by peer"
Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.774073 4722 generic.go:334] "Generic (PLEG): container finished" podID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerID="9f6bb518f1ba765c7a7052f429020ccd643361e2ca7e80330aa450dd36d72a26" exitCode=0
Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.774311 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7cc6894556-2r5j6" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-log" containerID="cri-o://c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df" gracePeriod=30
Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.774586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerDied","Data":"9f6bb518f1ba765c7a7052f429020ccd643361e2ca7e80330aa450dd36d72a26"}
Feb 19 19:39:24 crc kubenswrapper[4722]: I0219 19:39:24.774880 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7cc6894556-2r5j6" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-api" containerID="cri-o://95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d" gracePeriod=30
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.369385 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d6bc9fcb-2t849"
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.461638 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs\") pod \"da70c61d-7b82-48ee-bce0-53e96df3442d\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") "
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.461788 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlq7v\" (UniqueName: \"kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v\") pod \"da70c61d-7b82-48ee-bce0-53e96df3442d\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") "
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.461885 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data\") pod \"da70c61d-7b82-48ee-bce0-53e96df3442d\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") "
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.461922 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle\") pod \"da70c61d-7b82-48ee-bce0-53e96df3442d\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") "
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.461951 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom\") pod \"da70c61d-7b82-48ee-bce0-53e96df3442d\" (UID: \"da70c61d-7b82-48ee-bce0-53e96df3442d\") "
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.462081 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs" (OuterVolumeSpecName: "logs") pod "da70c61d-7b82-48ee-bce0-53e96df3442d" (UID: "da70c61d-7b82-48ee-bce0-53e96df3442d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.463367 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da70c61d-7b82-48ee-bce0-53e96df3442d-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.469289 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da70c61d-7b82-48ee-bce0-53e96df3442d" (UID: "da70c61d-7b82-48ee-bce0-53e96df3442d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.487400 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v" (OuterVolumeSpecName: "kube-api-access-tlq7v") pod "da70c61d-7b82-48ee-bce0-53e96df3442d" (UID: "da70c61d-7b82-48ee-bce0-53e96df3442d"). InnerVolumeSpecName "kube-api-access-tlq7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.525392 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data" (OuterVolumeSpecName: "config-data") pod "da70c61d-7b82-48ee-bce0-53e96df3442d" (UID: "da70c61d-7b82-48ee-bce0-53e96df3442d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.528848 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da70c61d-7b82-48ee-bce0-53e96df3442d" (UID: "da70c61d-7b82-48ee-bce0-53e96df3442d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.565190 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.565229 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.565244 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da70c61d-7b82-48ee-bce0-53e96df3442d-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.565257 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlq7v\" (UniqueName: \"kubernetes.io/projected/da70c61d-7b82-48ee-bce0-53e96df3442d-kube-api-access-tlq7v\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.808292 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d6bc9fcb-2t849" event={"ID":"da70c61d-7b82-48ee-bce0-53e96df3442d","Type":"ContainerDied","Data":"242a8544a68d9d74ed6eb73bcacefdebb2fd4ae624de878f66d716d08691a8be"}
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.808439 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d6bc9fcb-2t849"
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.808562 4722 scope.go:117] "RemoveContainer" containerID="9f6bb518f1ba765c7a7052f429020ccd643361e2ca7e80330aa450dd36d72a26"
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.814470 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerID="c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df" exitCode=143
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.814502 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerDied","Data":"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df"}
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.847395 4722 scope.go:117] "RemoveContainer" containerID="deec74702da9c72730adb9e092792817648059be9727bcc7760a0aa5c428553c"
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.854025 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"]
Feb 19 19:39:25 crc kubenswrapper[4722]: I0219 19:39:25.860979 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-59d6bc9fcb-2t849"]
Feb 19 19:39:27 crc kubenswrapper[4722]: I0219 19:39:27.086349 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" path="/var/lib/kubelet/pods/da70c61d-7b82-48ee-bce0-53e96df3442d/volumes"
Feb 19 19:39:27 crc kubenswrapper[4722]: I0219 19:39:27.271494 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.466134 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cc6894556-2r5j6"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") "
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") "
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527257 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x57pr\" (UniqueName: \"kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") "
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527326 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") "
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527414 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") "
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527493 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") "
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527556 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs\") pod \"e0e1ecfc-6394-4815-bf10-7623a5359525\" (UID: \"e0e1ecfc-6394-4815-bf10-7623a5359525\") "
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.527618 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs" (OuterVolumeSpecName: "logs") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.528075 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0e1ecfc-6394-4815-bf10-7623a5359525-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.551145 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr" (OuterVolumeSpecName: "kube-api-access-x57pr") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "kube-api-access-x57pr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.552118 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts" (OuterVolumeSpecName: "scripts") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.584625 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data" (OuterVolumeSpecName: "config-data") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.598259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.630145 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.630203 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.630213 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.630225 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x57pr\" (UniqueName: \"kubernetes.io/projected/e0e1ecfc-6394-4815-bf10-7623a5359525-kube-api-access-x57pr\") on node \"crc\" DevicePath \"\""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.653350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.663318 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.664066 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-log"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.664212 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-log"
Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.664311 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api-log"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.664398 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api-log"
Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.664491 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-api"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.664569 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-api"
Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.664688 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.664776 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.665168 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.665267 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-log"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.665356 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da70c61d-7b82-48ee-bce0-53e96df3442d" containerName="barbican-api-log"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.665442 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerName="placement-api"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.666481 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.668301 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e0e1ecfc-6394-4815-bf10-7623a5359525" (UID: "e0e1ecfc-6394-4815-bf10-7623a5359525"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.669125 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-czh7m"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.669396 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.669431 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.675807 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.732343 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.732693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.732879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient"
Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.733183 4722 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwssj\" (UniqueName: \"kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.733373 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.733389 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e1ecfc-6394-4815-bf10-7623a5359525-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.834523 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwssj\" (UniqueName: \"kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.834589 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.834692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: 
I0219 19:39:28.834740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.835649 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.838645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.842627 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.848292 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0e1ecfc-6394-4815-bf10-7623a5359525" containerID="95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d" exitCode=0 Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.848327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerDied","Data":"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d"} Feb 19 
19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.848352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cc6894556-2r5j6" event={"ID":"e0e1ecfc-6394-4815-bf10-7623a5359525","Type":"ContainerDied","Data":"c688d661fba42ba2a53e010b04af9f22dbacb7137f02c088f90b0645fc7ab228"} Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.848367 4722 scope.go:117] "RemoveContainer" containerID="95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.848457 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cc6894556-2r5j6" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.856122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwssj\" (UniqueName: \"kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj\") pod \"openstackclient\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.895993 4722 scope.go:117] "RemoveContainer" containerID="c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.896491 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7cc6894556-2r5j6"] Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.915816 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7cc6894556-2r5j6"] Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.924643 4722 scope.go:117] "RemoveContainer" containerID="95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d" Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.925318 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d\": container with ID 
starting with 95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d not found: ID does not exist" containerID="95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.925517 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d"} err="failed to get container status \"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d\": rpc error: code = NotFound desc = could not find container \"95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d\": container with ID starting with 95e43f606ceb068557b9f3535466fc8632b44032e33630d3ac3c38c32c5cac9d not found: ID does not exist" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.925676 4722 scope.go:117] "RemoveContainer" containerID="c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df" Feb 19 19:39:28 crc kubenswrapper[4722]: E0219 19:39:28.926231 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df\": container with ID starting with c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df not found: ID does not exist" containerID="c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.926411 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df"} err="failed to get container status \"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df\": rpc error: code = NotFound desc = could not find container \"c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df\": container with ID starting with c847cd020847269c8e575551f5f89c0bb64780f54df196aa8783ab91af0404df not found: 
ID does not exist" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.948463 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.949879 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:28 crc kubenswrapper[4722]: I0219 19:39:28.959611 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.013874 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.017638 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.062967 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.085872 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e1ecfc-6394-4815-bf10-7623a5359525" path="/var/lib/kubelet/pods/e0e1ecfc-6394-4815-bf10-7623a5359525/volumes" Feb 19 19:39:29 crc kubenswrapper[4722]: E0219 19:39:29.138244 4722 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 19 19:39:29 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_54745880-0d6d-432b-be90-a609a4f4bff6_0(6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e" Netns:"/var/run/netns/f605bd00-826b-4a6a-99e5-c59bf30ac4e0" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e;K8S_POD_UID=54745880-0d6d-432b-be90-a609a4f4bff6" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/54745880-0d6d-432b-be90-a609a4f4bff6]: expected pod UID "54745880-0d6d-432b-be90-a609a4f4bff6" but got "af557f35-ca9e-4990-bdcb-9e44366dab68" from Kube API Feb 19 19:39:29 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:39:29 crc kubenswrapper[4722]: > Feb 19 19:39:29 crc kubenswrapper[4722]: E0219 19:39:29.138315 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 19 19:39:29 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_54745880-0d6d-432b-be90-a609a4f4bff6_0(6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e" Netns:"/var/run/netns/f605bd00-826b-4a6a-99e5-c59bf30ac4e0" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=6dd2552ba69e95ca517da0b5e5af30380d18d6410664cb7e2dafdbb43d9c5f2e;K8S_POD_UID=54745880-0d6d-432b-be90-a609a4f4bff6" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: 
[openstack/openstackclient/54745880-0d6d-432b-be90-a609a4f4bff6]: expected pod UID "54745880-0d6d-432b-be90-a609a4f4bff6" but got "af557f35-ca9e-4990-bdcb-9e44366dab68" from Kube API Feb 19 19:39:29 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 19:39:29 crc kubenswrapper[4722]: > pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.146141 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmg9j\" (UniqueName: \"kubernetes.io/projected/af557f35-ca9e-4990-bdcb-9e44366dab68-kube-api-access-lmg9j\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.146259 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.146512 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config-secret\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.146669 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.248499 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.248554 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config-secret\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.248668 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.248743 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmg9j\" (UniqueName: \"kubernetes.io/projected/af557f35-ca9e-4990-bdcb-9e44366dab68-kube-api-access-lmg9j\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.249585 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.252986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-combined-ca-bundle\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.253542 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/af557f35-ca9e-4990-bdcb-9e44366dab68-openstack-config-secret\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.266033 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmg9j\" (UniqueName: \"kubernetes.io/projected/af557f35-ca9e-4990-bdcb-9e44366dab68-kube-api-access-lmg9j\") pod \"openstackclient\" (UID: \"af557f35-ca9e-4990-bdcb-9e44366dab68\") " pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.507801 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.858724 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.870610 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.877814 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="54745880-0d6d-432b-be90-a609a4f4bff6" podUID="af557f35-ca9e-4990-bdcb-9e44366dab68" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.962057 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle\") pod \"54745880-0d6d-432b-be90-a609a4f4bff6\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.962253 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config\") pod \"54745880-0d6d-432b-be90-a609a4f4bff6\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.962312 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret\") pod \"54745880-0d6d-432b-be90-a609a4f4bff6\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.962462 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwssj\" (UniqueName: \"kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj\") pod \"54745880-0d6d-432b-be90-a609a4f4bff6\" (UID: \"54745880-0d6d-432b-be90-a609a4f4bff6\") " Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.963132 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "54745880-0d6d-432b-be90-a609a4f4bff6" (UID: "54745880-0d6d-432b-be90-a609a4f4bff6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.968731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj" (OuterVolumeSpecName: "kube-api-access-gwssj") pod "54745880-0d6d-432b-be90-a609a4f4bff6" (UID: "54745880-0d6d-432b-be90-a609a4f4bff6"). InnerVolumeSpecName "kube-api-access-gwssj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.969412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54745880-0d6d-432b-be90-a609a4f4bff6" (UID: "54745880-0d6d-432b-be90-a609a4f4bff6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.971288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "54745880-0d6d-432b-be90-a609a4f4bff6" (UID: "54745880-0d6d-432b-be90-a609a4f4bff6"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:29 crc kubenswrapper[4722]: W0219 19:39:29.989349 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf557f35_ca9e_4990_bdcb_9e44366dab68.slice/crio-ae0266c24071345a896416aae5ec1b06eebb519adee9ba5174233ed20c6975c7 WatchSource:0}: Error finding container ae0266c24071345a896416aae5ec1b06eebb519adee9ba5174233ed20c6975c7: Status 404 returned error can't find the container with id ae0266c24071345a896416aae5ec1b06eebb519adee9ba5174233ed20c6975c7 Feb 19 19:39:29 crc kubenswrapper[4722]: I0219 19:39:29.993641 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.065561 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.065600 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.065613 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwssj\" (UniqueName: \"kubernetes.io/projected/54745880-0d6d-432b-be90-a609a4f4bff6-kube-api-access-gwssj\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.065625 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54745880-0d6d-432b-be90-a609a4f4bff6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.869768 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.871333 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"af557f35-ca9e-4990-bdcb-9e44366dab68","Type":"ContainerStarted","Data":"ae0266c24071345a896416aae5ec1b06eebb519adee9ba5174233ed20c6975c7"} Feb 19 19:39:30 crc kubenswrapper[4722]: I0219 19:39:30.884800 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="54745880-0d6d-432b-be90-a609a4f4bff6" podUID="af557f35-ca9e-4990-bdcb-9e44366dab68" Feb 19 19:39:31 crc kubenswrapper[4722]: I0219 19:39:31.085336 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54745880-0d6d-432b-be90-a609a4f4bff6" path="/var/lib/kubelet/pods/54745880-0d6d-432b-be90-a609a4f4bff6/volumes" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.325085 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.325490 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="sg-core" containerID="cri-o://281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b" gracePeriod=30 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.325651 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="proxy-httpd" containerID="cri-o://e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df" gracePeriod=30 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.325726 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-notification-agent" 
containerID="cri-o://870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5" gracePeriod=30 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.325393 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-central-agent" containerID="cri-o://51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d" gracePeriod=30 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.335274 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.184:3000/\": EOF" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.630569 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-b7b95d7bc-zqb9x"] Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.633376 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.638046 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.638059 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.638972 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.656545 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b7b95d7bc-zqb9x"] Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.722778 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsn6q\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-kube-api-access-tsn6q\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.722846 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-etc-swift\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.722878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-run-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 
19:39:32.722974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-log-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.723050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-combined-ca-bundle\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.723095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-public-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.723121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-internal-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.723207 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-config-data\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc 
kubenswrapper[4722]: I0219 19:39:32.825271 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-config-data\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsn6q\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-kube-api-access-tsn6q\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-etc-swift\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825450 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-run-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-log-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825561 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-combined-ca-bundle\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825607 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-public-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.825637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-internal-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.826408 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-run-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.827476 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42a3f824-28fe-4734-8ada-a74ffb9930a8-log-httpd\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.832726 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-internal-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.832768 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-public-tls-certs\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.834330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-config-data\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.834326 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42a3f824-28fe-4734-8ada-a74ffb9930a8-combined-ca-bundle\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.835181 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-etc-swift\") pod \"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.843274 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsn6q\" (UniqueName: \"kubernetes.io/projected/42a3f824-28fe-4734-8ada-a74ffb9930a8-kube-api-access-tsn6q\") pod 
\"swift-proxy-b7b95d7bc-zqb9x\" (UID: \"42a3f824-28fe-4734-8ada-a74ffb9930a8\") " pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890416 4722 generic.go:334] "Generic (PLEG): container finished" podID="41000a66-e725-4b1e-ab9c-31251213e311" containerID="e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df" exitCode=0 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890447 4722 generic.go:334] "Generic (PLEG): container finished" podID="41000a66-e725-4b1e-ab9c-31251213e311" containerID="281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b" exitCode=2 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890456 4722 generic.go:334] "Generic (PLEG): container finished" podID="41000a66-e725-4b1e-ab9c-31251213e311" containerID="51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d" exitCode=0 Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerDied","Data":"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df"} Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerDied","Data":"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b"} Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.890509 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerDied","Data":"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d"} Feb 19 19:39:32 crc kubenswrapper[4722]: I0219 19:39:32.959733 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:33 crc kubenswrapper[4722]: I0219 19:39:33.548659 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-b7b95d7bc-zqb9x"] Feb 19 19:39:33 crc kubenswrapper[4722]: W0219 19:39:33.567243 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42a3f824_28fe_4734_8ada_a74ffb9930a8.slice/crio-9a6ccaa264067c666f82b7782dd959f9cb56e2c1ccdd78818942da5cd87dab4d WatchSource:0}: Error finding container 9a6ccaa264067c666f82b7782dd959f9cb56e2c1ccdd78818942da5cd87dab4d: Status 404 returned error can't find the container with id 9a6ccaa264067c666f82b7782dd959f9cb56e2c1ccdd78818942da5cd87dab4d Feb 19 19:39:33 crc kubenswrapper[4722]: I0219 19:39:33.908755 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" event={"ID":"42a3f824-28fe-4734-8ada-a74ffb9930a8","Type":"ContainerStarted","Data":"9bb6330deff38fd0d18ecd03ae5cb24c9789fc21f1750510e2039924d227c72d"} Feb 19 19:39:33 crc kubenswrapper[4722]: I0219 19:39:33.909171 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" event={"ID":"42a3f824-28fe-4734-8ada-a74ffb9930a8","Type":"ContainerStarted","Data":"9a6ccaa264067c666f82b7782dd959f9cb56e2c1ccdd78818942da5cd87dab4d"} Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.177601 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8694c7b8f7-2td8g" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.255351 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.255798 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7445db86-7r6w9" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-api" 
containerID="cri-o://6956d55506ad813de368c67533400189dca7fad85038770d3e67703d4229d5da" gracePeriod=30 Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.256197 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7445db86-7r6w9" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-httpd" containerID="cri-o://df36524cd2a523caf0ae3f85ddef265e7c54e5ba8fa2da85c3fd083ca4ebd887" gracePeriod=30 Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.906213 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.934132 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" event={"ID":"42a3f824-28fe-4734-8ada-a74ffb9930a8","Type":"ContainerStarted","Data":"79793a96c41434eb0bb076408cbe18073688933274ac3aeaeacb989b708b5d57"} Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.935040 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.935068 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.937512 4722 generic.go:334] "Generic (PLEG): container finished" podID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerID="df36524cd2a523caf0ae3f85ddef265e7c54e5ba8fa2da85c3fd083ca4ebd887" exitCode=0 Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.937556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerDied","Data":"df36524cd2a523caf0ae3f85ddef265e7c54e5ba8fa2da85c3fd083ca4ebd887"} Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.961397 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="41000a66-e725-4b1e-ab9c-31251213e311" containerID="870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5" exitCode=0 Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.961443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerDied","Data":"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5"} Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.961476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41000a66-e725-4b1e-ab9c-31251213e311","Type":"ContainerDied","Data":"2fc5d288c8b590c8621fd130a7dd63655d59f6c92407b8882f0ffae525ddf63d"} Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.961495 4722 scope.go:117] "RemoveContainer" containerID="e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.961641 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.980769 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" podStartSLOduration=2.980737873 podStartE2EDuration="2.980737873s" podCreationTimestamp="2026-02-19 19:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:34.978325528 +0000 UTC m=+1274.590675872" watchObservedRunningTime="2026-02-19 19:39:34.980737873 +0000 UTC m=+1274.593088197" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994534 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994659 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994718 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: 
\"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994741 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72xgb\" (UniqueName: \"kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.994881 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd\") pod \"41000a66-e725-4b1e-ab9c-31251213e311\" (UID: \"41000a66-e725-4b1e-ab9c-31251213e311\") " Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.995689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:34 crc kubenswrapper[4722]: I0219 19:39:34.997049 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.001610 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts" (OuterVolumeSpecName: "scripts") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.008382 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb" (OuterVolumeSpecName: "kube-api-access-72xgb") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "kube-api-access-72xgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.036023 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.097138 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.097190 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.097200 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.097209 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41000a66-e725-4b1e-ab9c-31251213e311-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.097220 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72xgb\" (UniqueName: \"kubernetes.io/projected/41000a66-e725-4b1e-ab9c-31251213e311-kube-api-access-72xgb\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.126781 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data" (OuterVolumeSpecName: "config-data") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.140691 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41000a66-e725-4b1e-ab9c-31251213e311" (UID: "41000a66-e725-4b1e-ab9c-31251213e311"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.198572 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.198611 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41000a66-e725-4b1e-ab9c-31251213e311-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.321517 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.335176 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.349297 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:35 crc kubenswrapper[4722]: E0219 19:39:35.349759 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="proxy-httpd" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.349773 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="proxy-httpd" Feb 19 19:39:35 crc kubenswrapper[4722]: E0219 19:39:35.349803 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="sg-core" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.349808 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="sg-core" Feb 19 19:39:35 crc kubenswrapper[4722]: E0219 19:39:35.349824 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-notification-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.349831 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-notification-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: E0219 19:39:35.349840 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-central-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.349846 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-central-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.350017 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-central-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.350032 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="sg-core" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.350055 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="proxy-httpd" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.350064 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="41000a66-e725-4b1e-ab9c-31251213e311" containerName="ceilometer-notification-agent" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.351916 4722 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.357214 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.358460 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.361472 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505164 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcll5\" (UniqueName: \"kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd\") pod 
\"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505450 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.505537 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.606974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607046 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcll5\" (UniqueName: \"kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607137 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607170 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.607592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " 
pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.608482 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.613329 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.614005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.616841 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.617187 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.647741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcll5\" (UniqueName: 
\"kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5\") pod \"ceilometer-0\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") " pod="openstack/ceilometer-0" Feb 19 19:39:35 crc kubenswrapper[4722]: I0219 19:39:35.679791 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:39:36 crc kubenswrapper[4722]: I0219 19:39:36.197121 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-d74fd689-q5qhb" podUID="5c88f138-094d-44c0-b1c9-1492e7e11e9b" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.171:9696/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:39:37 crc kubenswrapper[4722]: I0219 19:39:37.088878 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41000a66-e725-4b1e-ab9c-31251213e311" path="/var/lib/kubelet/pods/41000a66-e725-4b1e-ab9c-31251213e311/volumes" Feb 19 19:39:38 crc kubenswrapper[4722]: I0219 19:39:38.374569 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:39:38 crc kubenswrapper[4722]: I0219 19:39:38.375196 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-log" containerID="cri-o://0f3ddcaf8c81704eaf6b201c98a6bdf76e2b380c4dac2d9db9d77cb9f737e62a" gracePeriod=30 Feb 19 19:39:38 crc kubenswrapper[4722]: I0219 19:39:38.375363 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-httpd" containerID="cri-o://3ce9bc56dc0250472fbd7d818bb628d5fdf7798657a6fd7b1570bd5c3b64c1ae" gracePeriod=30 Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.011015 4722 generic.go:334] "Generic (PLEG): 
container finished" podID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerID="6956d55506ad813de368c67533400189dca7fad85038770d3e67703d4229d5da" exitCode=0 Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.011086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerDied","Data":"6956d55506ad813de368c67533400189dca7fad85038770d3e67703d4229d5da"} Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.013682 4722 generic.go:334] "Generic (PLEG): container finished" podID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerID="0f3ddcaf8c81704eaf6b201c98a6bdf76e2b380c4dac2d9db9d77cb9f737e62a" exitCode=143 Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.013725 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerDied","Data":"0f3ddcaf8c81704eaf6b201c98a6bdf76e2b380c4dac2d9db9d77cb9f737e62a"} Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.135454 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.135685 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-log" containerID="cri-o://b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd" gracePeriod=30 Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.135813 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-httpd" containerID="cri-o://1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7" gracePeriod=30 Feb 19 19:39:39 crc kubenswrapper[4722]: I0219 19:39:39.997190 4722 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.028674 4722 generic.go:334] "Generic (PLEG): container finished" podID="58e51a47-7d37-46de-96cc-609365fab496" containerID="b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd" exitCode=143 Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.028715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerDied","Data":"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd"} Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.642651 4722 scope.go:117] "RemoveContainer" containerID="281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.791693 4722 scope.go:117] "RemoveContainer" containerID="870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.817675 4722 scope.go:117] "RemoveContainer" containerID="51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.937503 4722 scope.go:117] "RemoveContainer" containerID="e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df" Feb 19 19:39:40 crc kubenswrapper[4722]: E0219 19:39:40.940977 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df\": container with ID starting with e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df not found: ID does not exist" containerID="e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.941057 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df"} err="failed to get container status \"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df\": rpc error: code = NotFound desc = could not find container \"e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df\": container with ID starting with e3f604d0feb854bb6622ffa83cb4b4e74319f0ff44d81c1ef6787137f2fa99df not found: ID does not exist" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.941121 4722 scope.go:117] "RemoveContainer" containerID="281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b" Feb 19 19:39:40 crc kubenswrapper[4722]: E0219 19:39:40.941519 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b\": container with ID starting with 281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b not found: ID does not exist" containerID="281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.941544 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b"} err="failed to get container status \"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b\": rpc error: code = NotFound desc = could not find container \"281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b\": container with ID starting with 281a64ea15d26a2a4b2eec0344c306c244080f9473a7072ac04a28a4ff2a126b not found: ID does not exist" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.941561 4722 scope.go:117] "RemoveContainer" containerID="870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5" Feb 19 19:39:40 crc kubenswrapper[4722]: E0219 19:39:40.942045 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5\": container with ID starting with 870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5 not found: ID does not exist" containerID="870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.942067 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5"} err="failed to get container status \"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5\": rpc error: code = NotFound desc = could not find container \"870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5\": container with ID starting with 870fc694ce17093493698b2f076f91e551f522ed7f916207fc79d41779ffacc5 not found: ID does not exist" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.942082 4722 scope.go:117] "RemoveContainer" containerID="51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d" Feb 19 19:39:40 crc kubenswrapper[4722]: E0219 19:39:40.944230 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d\": container with ID starting with 51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d not found: ID does not exist" containerID="51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d" Feb 19 19:39:40 crc kubenswrapper[4722]: I0219 19:39:40.944274 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d"} err="failed to get container status \"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d\": rpc error: code = NotFound desc = could not find container 
\"51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d\": container with ID starting with 51a5de0f121a1c428a7fd29f001da7794388f32fa9d6cfd22fe28bac3c11288d not found: ID does not exist" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.044881 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7445db86-7r6w9" event={"ID":"cff58b5f-4c6b-44be-b668-15b2948e6af0","Type":"ContainerDied","Data":"0d269d0087152d6edd92c6c1c2324f5e6566d6cbbbcd03d88628b974769fb6f5"} Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.044917 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d269d0087152d6edd92c6c1c2324f5e6566d6cbbbcd03d88628b974769fb6f5" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.046063 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.054141 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"af557f35-ca9e-4990-bdcb-9e44366dab68","Type":"ContainerStarted","Data":"b233726d8e4e40dc0e1cb1f2de9faba0ca9d794275a44bf5c5a4deb84d8f9b15"} Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.119570 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.432882422 podStartE2EDuration="13.119554797s" podCreationTimestamp="2026-02-19 19:39:28 +0000 UTC" firstStartedPulling="2026-02-19 19:39:29.991377654 +0000 UTC m=+1269.603727978" lastFinishedPulling="2026-02-19 19:39:40.678050039 +0000 UTC m=+1280.290400353" observedRunningTime="2026-02-19 19:39:41.096425808 +0000 UTC m=+1280.708776182" watchObservedRunningTime="2026-02-19 19:39:41.119554797 +0000 UTC m=+1280.731905121" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.121634 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle\") pod \"cff58b5f-4c6b-44be-b668-15b2948e6af0\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.121718 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config\") pod \"cff58b5f-4c6b-44be-b668-15b2948e6af0\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.121755 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs\") pod \"cff58b5f-4c6b-44be-b668-15b2948e6af0\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.121862 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbx7r\" (UniqueName: \"kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r\") pod \"cff58b5f-4c6b-44be-b668-15b2948e6af0\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.121965 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config\") pod \"cff58b5f-4c6b-44be-b668-15b2948e6af0\" (UID: \"cff58b5f-4c6b-44be-b668-15b2948e6af0\") " Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.128854 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r" (OuterVolumeSpecName: "kube-api-access-dbx7r") pod "cff58b5f-4c6b-44be-b668-15b2948e6af0" (UID: "cff58b5f-4c6b-44be-b668-15b2948e6af0"). InnerVolumeSpecName "kube-api-access-dbx7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.130884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "cff58b5f-4c6b-44be-b668-15b2948e6af0" (UID: "cff58b5f-4c6b-44be-b668-15b2948e6af0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.185316 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cff58b5f-4c6b-44be-b668-15b2948e6af0" (UID: "cff58b5f-4c6b-44be-b668-15b2948e6af0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.211779 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "cff58b5f-4c6b-44be-b668-15b2948e6af0" (UID: "cff58b5f-4c6b-44be-b668-15b2948e6af0"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.220794 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.224659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config" (OuterVolumeSpecName: "config") pod "cff58b5f-4c6b-44be-b668-15b2948e6af0" (UID: "cff58b5f-4c6b-44be-b668-15b2948e6af0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.224734 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:41 crc kubenswrapper[4722]: W0219 19:39:41.224750 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3534949_6af7_4bf0_ba36_ed96804ada1b.slice/crio-ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab WatchSource:0}: Error finding container ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab: Status 404 returned error can't find the container with id ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.224759 4722 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.224793 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbx7r\" (UniqueName: \"kubernetes.io/projected/cff58b5f-4c6b-44be-b668-15b2948e6af0-kube-api-access-dbx7r\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.224808 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.229645 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:39:41 crc kubenswrapper[4722]: I0219 19:39:41.326562 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/cff58b5f-4c6b-44be-b668-15b2948e6af0-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.062183 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerStarted","Data":"ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab"} Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.066180 4722 generic.go:334] "Generic (PLEG): container finished" podID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerID="3ce9bc56dc0250472fbd7d818bb628d5fdf7798657a6fd7b1570bd5c3b64c1ae" exitCode=0 Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.066355 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerDied","Data":"3ce9bc56dc0250472fbd7d818bb628d5fdf7798657a6fd7b1570bd5c3b64c1ae"} Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.066471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"74c5d98f-45b4-4fd8-876b-3471da720a4b","Type":"ContainerDied","Data":"8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d"} Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.066552 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e47c9d091e13fff038509a7c2d6d944fe2c289a1a215ce8e16c3e4cee4c648d" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.066437 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7445db86-7r6w9" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.068890 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.137637 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144196 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144231 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144508 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144855 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6q6t\" (UniqueName: \"kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 
crc kubenswrapper[4722]: I0219 19:39:42.144878 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.144990 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.145017 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts\") pod \"74c5d98f-45b4-4fd8-876b-3471da720a4b\" (UID: \"74c5d98f-45b4-4fd8-876b-3471da720a4b\") " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.145772 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.145899 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs" (OuterVolumeSpecName: "logs") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.148672 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.148694 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74c5d98f-45b4-4fd8-876b-3471da720a4b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.153619 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7445db86-7r6w9"] Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.166409 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts" (OuterVolumeSpecName: "scripts") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.167265 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t" (OuterVolumeSpecName: "kube-api-access-h6q6t") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "kube-api-access-h6q6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.189082 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d" (OuterVolumeSpecName: "glance") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). 
InnerVolumeSpecName "pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.217891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.219270 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data" (OuterVolumeSpecName: "config-data") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.247665 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "74c5d98f-45b4-4fd8-876b-3471da720a4b" (UID: "74c5d98f-45b4-4fd8-876b-3471da720a4b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250388 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250422 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250434 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250469 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") on node \"crc\" " Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250484 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6q6t\" (UniqueName: \"kubernetes.io/projected/74c5d98f-45b4-4fd8-876b-3471da720a4b-kube-api-access-h6q6t\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.250496 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c5d98f-45b4-4fd8-876b-3471da720a4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.295616 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.295778 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d") on node "crc" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.306134 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.166:9292/healthcheck\": read tcp 10.217.0.2:41080->10.217.0.166:9292: read: connection reset by peer" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.306560 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.166:9292/healthcheck\": read tcp 10.217.0.2:41094->10.217.0.166:9292: read: connection reset by peer" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.352029 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:42 crc kubenswrapper[4722]: I0219 19:39:42.923690 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063419 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063695 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g46np\" (UniqueName: \"kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063788 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063876 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063920 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.063935 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.064001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.064142 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"58e51a47-7d37-46de-96cc-609365fab496\" (UID: \"58e51a47-7d37-46de-96cc-609365fab496\") " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.065757 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.075690 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs" (OuterVolumeSpecName: "logs") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.094002 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts" (OuterVolumeSpecName: "scripts") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.110582 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np" (OuterVolumeSpecName: "kube-api-access-g46np") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "kube-api-access-g46np". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.140981 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" path="/var/lib/kubelet/pods/cff58b5f-4c6b-44be-b668-15b2948e6af0/volumes" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.156893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.166333 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g46np\" (UniqueName: \"kubernetes.io/projected/58e51a47-7d37-46de-96cc-609365fab496-kube-api-access-g46np\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.167015 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.167109 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.167183 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58e51a47-7d37-46de-96cc-609365fab496-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.167246 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.179690 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.181407 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-b7b95d7bc-zqb9x" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.184589 4722 generic.go:334] "Generic (PLEG): container finished" podID="58e51a47-7d37-46de-96cc-609365fab496" containerID="1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7" exitCode=0 Feb 19 19:39:43 
crc kubenswrapper[4722]: I0219 19:39:43.184671 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerDied","Data":"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7"} Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.184698 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"58e51a47-7d37-46de-96cc-609365fab496","Type":"ContainerDied","Data":"9d53207634e6b7ef9226749da4be244094bd8e2655c281755c661fa33e7511ac"} Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.184716 4722 scope.go:117] "RemoveContainer" containerID="1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.184848 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.194455 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.197828 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerStarted","Data":"14558b2b43b12bd6f938bfe33b938c7705b1528f8c8be67e451dfa9069d61fa8"} Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.226135 4722 scope.go:117] "RemoveContainer" containerID="b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.288340 4722 scope.go:117] "RemoveContainer" containerID="1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.288720 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7\": container with ID starting with 1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7 not found: ID does not exist" containerID="1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.288829 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7"} err="failed to get container status \"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7\": rpc error: code = NotFound desc = could not find container \"1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7\": container with ID starting with 1013399b72c2818ff8e6ac0662958cf73255fecba9462a8d39b8c1d126038ee7 not found: ID does not exist" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.288906 4722 scope.go:117] "RemoveContainer" containerID="b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd" Feb 19 19:39:43 crc 
kubenswrapper[4722]: E0219 19:39:43.289259 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd\": container with ID starting with b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd not found: ID does not exist" containerID="b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.289303 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd"} err="failed to get container status \"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd\": rpc error: code = NotFound desc = could not find container \"b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd\": container with ID starting with b412249d2c4b40e48b0ee8187f37d594d62e8acaa4c4d41c7f7f42ad9753e2bd not found: ID does not exist" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.300666 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.341620 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355057 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355511 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-api" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355534 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-api" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355547 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355553 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355565 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355572 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355603 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355609 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355617 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355622 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: E0219 19:39:43.355631 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355637 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355799 4722 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355812 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355823 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e51a47-7d37-46de-96cc-609365fab496" containerName="glance-log" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355832 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-api" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355853 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.355865 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff58b5f-4c6b-44be-b668-15b2948e6af0" containerName="neutron-httpd" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.356961 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.359579 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.359815 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.363752 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5" (OuterVolumeSpecName: "glance") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). 
InnerVolumeSpecName "pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.376706 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.378216 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") on node \"crc\" " Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.434724 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.462546 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data" (OuterVolumeSpecName: "config-data") pod "58e51a47-7d37-46de-96cc-609365fab496" (UID: "58e51a47-7d37-46de-96cc-609365fab496"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.473190 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.473339 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5") on node "crc" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479759 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpqkj\" (UniqueName: \"kubernetes.io/projected/84bb340d-f999-45fc-8e1c-d813e2ad4319-kube-api-access-qpqkj\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 
19:39:43.479825 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-config-data\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-logs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479918 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.479944 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-scripts\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.480009 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.480021 4722 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.480030 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e51a47-7d37-46de-96cc-609365fab496-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpqkj\" (UniqueName: \"kubernetes.io/projected/84bb340d-f999-45fc-8e1c-d813e2ad4319-kube-api-access-qpqkj\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-config-data\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-logs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581808 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " 
pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-scripts\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581948 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.581986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.582496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc 
kubenswrapper[4722]: I0219 19:39:43.586916 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84bb340d-f999-45fc-8e1c-d813e2ad4319-logs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.588543 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.589429 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.589542 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.591968 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-scripts\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.592869 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bb340d-f999-45fc-8e1c-d813e2ad4319-config-data\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " 
pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.598472 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.598521 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2019fecbddc337ddf53783637eb0008bc901e49a55294deb1e2d06fbb77c3ae3/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.602563 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.606831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpqkj\" (UniqueName: \"kubernetes.io/projected/84bb340d-f999-45fc-8e1c-d813e2ad4319-kube-api-access-qpqkj\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.640081 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.675130 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.678896 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.680345 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.728288 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.752301 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99badf1b-2964-4b9a-af9b-a90cfb1ef39d\") pod \"glance-default-external-api-0\" (UID: \"84bb340d-f999-45fc-8e1c-d813e2ad4319\") " pod="openstack/glance-default-external-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800417 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800447 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800471 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800550 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800568 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-logs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0" Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.800606 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv8lg\" (UniqueName: \"kubernetes.io/projected/99490f57-22ed-4652-a112-bf45feb67aee-kube-api-access-cv8lg\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.901989 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902080 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902130 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-logs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv8lg\" (UniqueName: \"kubernetes.io/projected/99490f57-22ed-4652-a112-bf45feb67aee-kube-api-access-cv8lg\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902246 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902277 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.902305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.903550 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-logs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.903839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99490f57-22ed-4652-a112-bf45feb67aee-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.907928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.912927 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.913642 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.926078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99490f57-22ed-4652-a112-bf45feb67aee-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.927534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv8lg\" (UniqueName: \"kubernetes.io/projected/99490f57-22ed-4652-a112-bf45feb67aee-kube-api-access-cv8lg\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.930676 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.930718 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b323df4ccd136fd865256cd83fe693e56c32fbc8a05d96b41caf6babb703da86/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:43 crc kubenswrapper[4722]: I0219 19:39:43.979245 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:44 crc kubenswrapper[4722]: I0219 19:39:44.064037 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1a0b24b-75aa-4c12-b8e3-b8645a4cb3c5\") pod \"glance-default-internal-api-0\" (UID: \"99490f57-22ed-4652-a112-bf45feb67aee\") " pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:44 crc kubenswrapper[4722]: I0219 19:39:44.072263 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:44 crc kubenswrapper[4722]: I0219 19:39:44.286533 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerStarted","Data":"2b66f446c38c7a939e26a897ca89dd08de59cc960c89adaf35fbd0e82bf8f636"}
Feb 19 19:39:44 crc kubenswrapper[4722]: W0219 19:39:44.674312 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84bb340d_f999_45fc_8e1c_d813e2ad4319.slice/crio-e6d9770cbfa1720863427de1bbfd7c1f4530fad0f1453c66cd9332a224fc1976 WatchSource:0}: Error finding container e6d9770cbfa1720863427de1bbfd7c1f4530fad0f1453c66cd9332a224fc1976: Status 404 returned error can't find the container with id e6d9770cbfa1720863427de1bbfd7c1f4530fad0f1453c66cd9332a224fc1976
Feb 19 19:39:44 crc kubenswrapper[4722]: I0219 19:39:44.675440 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 19:39:44 crc kubenswrapper[4722]: I0219 19:39:44.846200 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 19:39:44 crc kubenswrapper[4722]: W0219 19:39:44.855223 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99490f57_22ed_4652_a112_bf45feb67aee.slice/crio-ddb4237ec1a9f4d49e463fc1ed041d6a07c0296bc8824541b0767e3d445b83cd WatchSource:0}: Error finding container ddb4237ec1a9f4d49e463fc1ed041d6a07c0296bc8824541b0767e3d445b83cd: Status 404 returned error can't find the container with id ddb4237ec1a9f4d49e463fc1ed041d6a07c0296bc8824541b0767e3d445b83cd
Feb 19 19:39:45 crc kubenswrapper[4722]: I0219 19:39:45.085123 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e51a47-7d37-46de-96cc-609365fab496" path="/var/lib/kubelet/pods/58e51a47-7d37-46de-96cc-609365fab496/volumes"
Feb 19 19:39:45 crc kubenswrapper[4722]: I0219 19:39:45.086969 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" path="/var/lib/kubelet/pods/74c5d98f-45b4-4fd8-876b-3471da720a4b/volumes"
Feb 19 19:39:45 crc kubenswrapper[4722]: I0219 19:39:45.298751 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84bb340d-f999-45fc-8e1c-d813e2ad4319","Type":"ContainerStarted","Data":"e6d9770cbfa1720863427de1bbfd7c1f4530fad0f1453c66cd9332a224fc1976"}
Feb 19 19:39:45 crc kubenswrapper[4722]: I0219 19:39:45.300022 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99490f57-22ed-4652-a112-bf45feb67aee","Type":"ContainerStarted","Data":"ddb4237ec1a9f4d49e463fc1ed041d6a07c0296bc8824541b0767e3d445b83cd"}
Feb 19 19:39:45 crc kubenswrapper[4722]: I0219 19:39:45.304489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerStarted","Data":"a1a51025c6ac3a0493c572c91d9a6b9ce6a00a5a4e017dc5fcf2b5b985ce7e56"}
Feb 19 19:39:46 crc kubenswrapper[4722]: I0219 19:39:46.324374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99490f57-22ed-4652-a112-bf45feb67aee","Type":"ContainerStarted","Data":"3a7b5fad0bd64aaf40effe613e01c3aabc29d539bcc949fb8f17a519a11b024b"}
Feb 19 19:39:46 crc kubenswrapper[4722]: I0219 19:39:46.327951 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84bb340d-f999-45fc-8e1c-d813e2ad4319","Type":"ContainerStarted","Data":"44101d7618ca249c3c5c563fb8a0acf458f204e06915400900d870e20bf3d61c"}
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.338881 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99490f57-22ed-4652-a112-bf45feb67aee","Type":"ContainerStarted","Data":"870ba3af4dad9e4cb08a056b1ebb31973272630f093731881261d0918622a9a2"}
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341530 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerStarted","Data":"1ed5b5b084253c379ef4f64ca1d2a98bf7db526329e58a539659a2694681f3a0"}
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341646 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="sg-core" containerID="cri-o://a1a51025c6ac3a0493c572c91d9a6b9ce6a00a5a4e017dc5fcf2b5b985ce7e56" gracePeriod=30
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341674 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341641 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-central-agent" containerID="cri-o://14558b2b43b12bd6f938bfe33b938c7705b1528f8c8be67e451dfa9069d61fa8" gracePeriod=30
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341702 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-notification-agent" containerID="cri-o://2b66f446c38c7a939e26a897ca89dd08de59cc960c89adaf35fbd0e82bf8f636" gracePeriod=30
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.341696 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="proxy-httpd" containerID="cri-o://1ed5b5b084253c379ef4f64ca1d2a98bf7db526329e58a539659a2694681f3a0" gracePeriod=30
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.345412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"84bb340d-f999-45fc-8e1c-d813e2ad4319","Type":"ContainerStarted","Data":"d99be33c8ebb5d272bc11d22e7f63115904f5bfad3e3f567d9aa8b4615656edf"}
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.362264 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.362247634 podStartE2EDuration="4.362247634s" podCreationTimestamp="2026-02-19 19:39:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:47.361144959 +0000 UTC m=+1286.973495293" watchObservedRunningTime="2026-02-19 19:39:47.362247634 +0000 UTC m=+1286.974597958"
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.383995 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.3839812 podStartE2EDuration="4.3839812s" podCreationTimestamp="2026-02-19 19:39:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:39:47.383471814 +0000 UTC m=+1286.995822138" watchObservedRunningTime="2026-02-19 19:39:47.3839812 +0000 UTC m=+1286.996331524"
Feb 19 19:39:47 crc kubenswrapper[4722]: I0219 19:39:47.417618 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.881998181 podStartE2EDuration="12.417596996s" podCreationTimestamp="2026-02-19 19:39:35 +0000 UTC" firstStartedPulling="2026-02-19 19:39:41.229354073 +0000 UTC m=+1280.841704397" lastFinishedPulling="2026-02-19 19:39:46.764952868 +0000 UTC m=+1286.377303212" observedRunningTime="2026-02-19 19:39:47.410616538 +0000 UTC m=+1287.022966872" watchObservedRunningTime="2026-02-19 19:39:47.417596996 +0000 UTC m=+1287.029947330"
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.357807 4722 generic.go:334] "Generic (PLEG): container finished" podID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerID="1ed5b5b084253c379ef4f64ca1d2a98bf7db526329e58a539659a2694681f3a0" exitCode=0
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.358003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerDied","Data":"1ed5b5b084253c379ef4f64ca1d2a98bf7db526329e58a539659a2694681f3a0"}
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.358065 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerDied","Data":"a1a51025c6ac3a0493c572c91d9a6b9ce6a00a5a4e017dc5fcf2b5b985ce7e56"}
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.358026 4722 generic.go:334] "Generic (PLEG): container finished" podID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerID="a1a51025c6ac3a0493c572c91d9a6b9ce6a00a5a4e017dc5fcf2b5b985ce7e56" exitCode=2
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.358094 4722 generic.go:334] "Generic (PLEG): container finished" podID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerID="2b66f446c38c7a939e26a897ca89dd08de59cc960c89adaf35fbd0e82bf8f636" exitCode=0
Feb 19 19:39:48 crc kubenswrapper[4722]: I0219 19:39:48.358124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerDied","Data":"2b66f446c38c7a939e26a897ca89dd08de59cc960c89adaf35fbd0e82bf8f636"}
Feb 19 19:39:53 crc kubenswrapper[4722]: I0219 19:39:53.402285 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0"
Feb 19 19:39:53 crc kubenswrapper[4722]: I0219 19:39:53.979745 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:53 crc kubenswrapper[4722]: I0219 19:39:53.980076 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.025700 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.036355 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.073160 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.073211 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.132917 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.163126 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.423860 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.423906 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.423918 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:54 crc kubenswrapper[4722]: I0219 19:39:54.423928 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:56 crc kubenswrapper[4722]: I0219 19:39:56.655193 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:56 crc kubenswrapper[4722]: I0219 19:39:56.655765 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 19:39:56 crc kubenswrapper[4722]: I0219 19:39:56.861863 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 19:39:56 crc kubenswrapper[4722]: I0219 19:39:56.861955 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 19 19:39:56 crc kubenswrapper[4722]: I0219 19:39:56.885920 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 19 19:39:57 crc kubenswrapper[4722]: I0219 19:39:57.534071 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.491366 4722 generic.go:334] "Generic (PLEG): container finished" podID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerID="14558b2b43b12bd6f938bfe33b938c7705b1528f8c8be67e451dfa9069d61fa8" exitCode=0
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.491436 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerDied","Data":"14558b2b43b12bd6f938bfe33b938c7705b1528f8c8be67e451dfa9069d61fa8"}
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.491953 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d3534949-6af7-4bf0-ba36-ed96804ada1b","Type":"ContainerDied","Data":"ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab"}
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.491970 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed02ab2736a8a5c0a79a953eecaf3af4a07414cf3f0246382c94a16fdd33f1ab"
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.559759 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.668791 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.668857 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.668914 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.669041 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.669166 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcll5\" (UniqueName: \"kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.669239 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.669306 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd\") pod \"d3534949-6af7-4bf0-ba36-ed96804ada1b\" (UID: \"d3534949-6af7-4bf0-ba36-ed96804ada1b\") "
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.669902 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.670563 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.670763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.678884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5" (OuterVolumeSpecName: "kube-api-access-dcll5") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "kube-api-access-dcll5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.694382 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts" (OuterVolumeSpecName: "scripts") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.775359 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d3534949-6af7-4bf0-ba36-ed96804ada1b-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.775584 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcll5\" (UniqueName: \"kubernetes.io/projected/d3534949-6af7-4bf0-ba36-ed96804ada1b-kube-api-access-dcll5\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.775648 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.815413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.850341 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.868824 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data" (OuterVolumeSpecName: "config-data") pod "d3534949-6af7-4bf0-ba36-ed96804ada1b" (UID: "d3534949-6af7-4bf0-ba36-ed96804ada1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.877359 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.877499 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:00 crc kubenswrapper[4722]: I0219 19:40:00.877571 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3534949-6af7-4bf0-ba36-ed96804ada1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.509988 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.548370 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.560763 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.578811 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:40:01 crc kubenswrapper[4722]: E0219 19:40:01.579326 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-notification-agent"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579352 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-notification-agent"
Feb 19 19:40:01 crc kubenswrapper[4722]: E0219 19:40:01.579363 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="proxy-httpd"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579371 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="proxy-httpd"
Feb 19 19:40:01 crc kubenswrapper[4722]: E0219 19:40:01.579408 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="sg-core"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579417 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="sg-core"
Feb 19 19:40:01 crc kubenswrapper[4722]: E0219 19:40:01.579428 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-central-agent"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579434 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-central-agent"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579663 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-central-agent"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579685 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="sg-core"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579704 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="proxy-httpd"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.579729 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" containerName="ceilometer-notification-agent"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.584368 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.590646 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.590856 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.613820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.694098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.694491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.694738 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.694934 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.695128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.695347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.695498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7dx9\" (UniqueName: \"kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.796992 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797032 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797063 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797122 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797621 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.797965 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.798012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.798026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7dx9\" (UniqueName: \"kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.802673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.802883 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.807957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.809816 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts\") pod \"ceilometer-0\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0"
Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.817753 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7dx9\" (UniqueName: 
\"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") " pod="openstack/ceilometer-0" Feb 19 19:40:01 crc kubenswrapper[4722]: I0219 19:40:01.907123 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:40:02 crc kubenswrapper[4722]: I0219 19:40:02.413033 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:40:02 crc kubenswrapper[4722]: W0219 19:40:02.415054 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e17a08d_48a9_43c6_acd3_5bcc13df91df.slice/crio-d27549932ec42cb5113394ad965b534e33b29d9f16e5cedbb8c7ea91e1576243 WatchSource:0}: Error finding container d27549932ec42cb5113394ad965b534e33b29d9f16e5cedbb8c7ea91e1576243: Status 404 returned error can't find the container with id d27549932ec42cb5113394ad965b534e33b29d9f16e5cedbb8c7ea91e1576243 Feb 19 19:40:02 crc kubenswrapper[4722]: I0219 19:40:02.527089 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerStarted","Data":"d27549932ec42cb5113394ad965b534e33b29d9f16e5cedbb8c7ea91e1576243"} Feb 19 19:40:03 crc kubenswrapper[4722]: I0219 19:40:03.083825 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3534949-6af7-4bf0-ba36-ed96804ada1b" path="/var/lib/kubelet/pods/d3534949-6af7-4bf0-ba36-ed96804ada1b/volumes" Feb 19 19:40:03 crc kubenswrapper[4722]: I0219 19:40:03.542542 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerStarted","Data":"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84"} Feb 19 19:40:04 crc kubenswrapper[4722]: I0219 19:40:04.567615 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerStarted","Data":"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0"} Feb 19 19:40:04 crc kubenswrapper[4722]: I0219 19:40:04.567966 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerStarted","Data":"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597"} Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.394720 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-l92p9"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.396307 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.407033 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-l92p9"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.495533 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-nq58z"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.498255 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.508322 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq5kp\" (UniqueName: \"kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.508381 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.541140 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nq58z"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.614516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq5kp\" (UniqueName: \"kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.614578 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts\") pod \"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.614622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-j5t2n\" (UniqueName: \"kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n\") pod \"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.614650 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.615645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.627529 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-cda6-account-create-update-45ddh"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.629144 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.631598 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.640038 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cda6-account-create-update-45ddh"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.643246 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq5kp\" (UniqueName: \"kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp\") pod \"nova-api-db-create-l92p9\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.717213 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4fzxz"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.718556 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.718920 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gvfs\" (UniqueName: \"kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.719006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.721752 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts\") pod \"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.721821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5t2n\" (UniqueName: \"kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n\") pod \"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.722938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts\") pod 
\"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.734139 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4fzxz"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.757639 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5t2n\" (UniqueName: \"kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n\") pod \"nova-cell0-db-create-nq58z\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.820518 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b095-account-create-update-d2ffx"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.822183 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.824534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqt6l\" (UniqueName: \"kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.824687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gvfs\" (UniqueName: \"kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.824760 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.824898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.826651 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.845745 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.847045 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gvfs\" (UniqueName: \"kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.848302 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts\") pod \"nova-api-cda6-account-create-update-45ddh\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.850654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b095-account-create-update-d2ffx"] Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.873873 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.933727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.933801 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.933949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptb2\" (UniqueName: \"kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.934044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqt6l\" (UniqueName: \"kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.935926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:06 crc kubenswrapper[4722]: I0219 19:40:06.990470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqt6l\" (UniqueName: \"kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l\") pod \"nova-cell1-db-create-4fzxz\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.024401 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.035946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.036047 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptb2\" (UniqueName: \"kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.036936 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: 
\"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.057670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptb2\" (UniqueName: \"kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2\") pod \"nova-cell0-b095-account-create-update-d2ffx\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.059054 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.061968 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-bc61-account-create-update-km828"] Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.067434 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.070069 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.091589 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bc61-account-create-update-km828"] Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.152617 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.241362 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zxdf\" (UniqueName: \"kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.241542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.343916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zxdf\" (UniqueName: \"kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.344043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.344859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.383286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zxdf\" (UniqueName: \"kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf\") pod \"nova-cell1-bc61-account-create-update-km828\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.395043 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.527300 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nq58z"] Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.675217 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerStarted","Data":"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4"} Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.676781 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.692744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nq58z" event={"ID":"823fc346-84d0-4920-bc42-ec213d0c6eef","Type":"ContainerStarted","Data":"9321baf0393d59bba73cb4fc60396e56bf1d2d8783d6fe2b8d651c63240d3d1c"} Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.699624 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-l92p9"] Feb 19 19:40:07 crc 
kubenswrapper[4722]: I0219 19:40:07.730672 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.814398166 podStartE2EDuration="6.730649033s" podCreationTimestamp="2026-02-19 19:40:01 +0000 UTC" firstStartedPulling="2026-02-19 19:40:02.417527811 +0000 UTC m=+1302.029878135" lastFinishedPulling="2026-02-19 19:40:06.333778678 +0000 UTC m=+1305.946129002" observedRunningTime="2026-02-19 19:40:07.713670354 +0000 UTC m=+1307.326020678" watchObservedRunningTime="2026-02-19 19:40:07.730649033 +0000 UTC m=+1307.342999357" Feb 19 19:40:07 crc kubenswrapper[4722]: I0219 19:40:07.978713 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4fzxz"] Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.150873 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b095-account-create-update-d2ffx"] Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.161519 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-bc61-account-create-update-km828"] Feb 19 19:40:08 crc kubenswrapper[4722]: W0219 19:40:08.221814 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a72e03c_87f6_4d54_8ea1_f8abed33bd2c.slice/crio-7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75 WatchSource:0}: Error finding container 7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75: Status 404 returned error can't find the container with id 7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75 Feb 19 19:40:08 crc kubenswrapper[4722]: W0219 19:40:08.269437 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f262eb9_64a7_4b10_85f9_4bc43d512f60.slice/crio-3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d WatchSource:0}: Error finding 
container 3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d: Status 404 returned error can't find the container with id 3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.358990 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cda6-account-create-update-45ddh"] Feb 19 19:40:08 crc kubenswrapper[4722]: W0219 19:40:08.367950 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b8ebb77_caea_46ca_8989_d2dd37bf2df5.slice/crio-d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255 WatchSource:0}: Error finding container d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255: Status 404 returned error can't find the container with id d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255 Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.708714 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda6-account-create-update-45ddh" event={"ID":"5b8ebb77-caea-46ca-8989-d2dd37bf2df5","Type":"ContainerStarted","Data":"bcf1c97e5c8d595576441c4adc1b3414c50f70e142078a59a013a524b3fc5783"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.709115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda6-account-create-update-45ddh" event={"ID":"5b8ebb77-caea-46ca-8989-d2dd37bf2df5","Type":"ContainerStarted","Data":"d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.711215 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bc61-account-create-update-km828" event={"ID":"3f262eb9-64a7-4b10-85f9-4bc43d512f60","Type":"ContainerStarted","Data":"a8d182f3ca75056fc67eb781e9901b0b7fa4501055d209f0f02c035090c589a3"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.711256 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-bc61-account-create-update-km828" event={"ID":"3f262eb9-64a7-4b10-85f9-4bc43d512f60","Type":"ContainerStarted","Data":"3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.715309 4722 generic.go:334] "Generic (PLEG): container finished" podID="84699ef3-8d21-4493-8875-81de167ee617" containerID="02fdf9891e0a4a5e6a9cd6279f1ac5170d3eaad2e2904682a600a6d410fb2a19" exitCode=0 Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.715732 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4fzxz" event={"ID":"84699ef3-8d21-4493-8875-81de167ee617","Type":"ContainerDied","Data":"02fdf9891e0a4a5e6a9cd6279f1ac5170d3eaad2e2904682a600a6d410fb2a19"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.715991 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4fzxz" event={"ID":"84699ef3-8d21-4493-8875-81de167ee617","Type":"ContainerStarted","Data":"6429e62e2e41653fc6362e37e869fbccbb59b4e67585320b1880dd9be47080f2"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.719602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" event={"ID":"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c","Type":"ContainerStarted","Data":"7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.726101 4722 generic.go:334] "Generic (PLEG): container finished" podID="823fc346-84d0-4920-bc42-ec213d0c6eef" containerID="34ce6fe937d88e617e83f04f4163bf9713e6cac4114d5734077d30be33461dbc" exitCode=0 Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.726428 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nq58z" 
event={"ID":"823fc346-84d0-4920-bc42-ec213d0c6eef","Type":"ContainerDied","Data":"34ce6fe937d88e617e83f04f4163bf9713e6cac4114d5734077d30be33461dbc"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.730134 4722 generic.go:334] "Generic (PLEG): container finished" podID="c6e27062-a94f-4d8d-8a07-b940d9aa572e" containerID="59b7ab3b9b5c89b55e17c8616e639ea24cc02e1ca89d3d887ff255092c310b2a" exitCode=0 Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.730985 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-l92p9" event={"ID":"c6e27062-a94f-4d8d-8a07-b940d9aa572e","Type":"ContainerDied","Data":"59b7ab3b9b5c89b55e17c8616e639ea24cc02e1ca89d3d887ff255092c310b2a"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.731019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-l92p9" event={"ID":"c6e27062-a94f-4d8d-8a07-b940d9aa572e","Type":"ContainerStarted","Data":"3111868eed1037ee898be61210727fc20dfdcb07c463168784d2422bc46d76bc"} Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.744606 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-cda6-account-create-update-45ddh" podStartSLOduration=2.744583622 podStartE2EDuration="2.744583622s" podCreationTimestamp="2026-02-19 19:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:08.72554052 +0000 UTC m=+1308.337890854" watchObservedRunningTime="2026-02-19 19:40:08.744583622 +0000 UTC m=+1308.356933946" Feb 19 19:40:08 crc kubenswrapper[4722]: I0219 19:40:08.790890 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-bc61-account-create-update-km828" podStartSLOduration=2.790865372 podStartE2EDuration="2.790865372s" podCreationTimestamp="2026-02-19 19:40:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:08.775071401 +0000 UTC m=+1308.387421745" watchObservedRunningTime="2026-02-19 19:40:08.790865372 +0000 UTC m=+1308.403215706" Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.742815 4722 generic.go:334] "Generic (PLEG): container finished" podID="3f262eb9-64a7-4b10-85f9-4bc43d512f60" containerID="a8d182f3ca75056fc67eb781e9901b0b7fa4501055d209f0f02c035090c589a3" exitCode=0 Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.742889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bc61-account-create-update-km828" event={"ID":"3f262eb9-64a7-4b10-85f9-4bc43d512f60","Type":"ContainerDied","Data":"a8d182f3ca75056fc67eb781e9901b0b7fa4501055d209f0f02c035090c589a3"} Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.744858 4722 generic.go:334] "Generic (PLEG): container finished" podID="6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" containerID="0c662d869f0260b21b14e815b1c26ef3d995bd4318e89a8c8d85dd5703eaa89e" exitCode=0 Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.744928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" event={"ID":"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c","Type":"ContainerDied","Data":"0c662d869f0260b21b14e815b1c26ef3d995bd4318e89a8c8d85dd5703eaa89e"} Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.746560 4722 generic.go:334] "Generic (PLEG): container finished" podID="5b8ebb77-caea-46ca-8989-d2dd37bf2df5" containerID="bcf1c97e5c8d595576441c4adc1b3414c50f70e142078a59a013a524b3fc5783" exitCode=0 Feb 19 19:40:09 crc kubenswrapper[4722]: I0219 19:40:09.746672 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda6-account-create-update-45ddh" event={"ID":"5b8ebb77-caea-46ca-8989-d2dd37bf2df5","Type":"ContainerDied","Data":"bcf1c97e5c8d595576441c4adc1b3414c50f70e142078a59a013a524b3fc5783"} Feb 19 19:40:10 crc 
kubenswrapper[4722]: I0219 19:40:10.310551 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.427336 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts\") pod \"84699ef3-8d21-4493-8875-81de167ee617\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.427511 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqt6l\" (UniqueName: \"kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l\") pod \"84699ef3-8d21-4493-8875-81de167ee617\" (UID: \"84699ef3-8d21-4493-8875-81de167ee617\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.427993 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84699ef3-8d21-4493-8875-81de167ee617" (UID: "84699ef3-8d21-4493-8875-81de167ee617"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.428238 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84699ef3-8d21-4493-8875-81de167ee617-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.435393 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l" (OuterVolumeSpecName: "kube-api-access-kqt6l") pod "84699ef3-8d21-4493-8875-81de167ee617" (UID: "84699ef3-8d21-4493-8875-81de167ee617"). 
InnerVolumeSpecName "kube-api-access-kqt6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.487513 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.491620 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.529210 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts\") pod \"823fc346-84d0-4920-bc42-ec213d0c6eef\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.529621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq5kp\" (UniqueName: \"kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp\") pod \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.529664 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5t2n\" (UniqueName: \"kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n\") pod \"823fc346-84d0-4920-bc42-ec213d0c6eef\" (UID: \"823fc346-84d0-4920-bc42-ec213d0c6eef\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.529689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "823fc346-84d0-4920-bc42-ec213d0c6eef" (UID: "823fc346-84d0-4920-bc42-ec213d0c6eef"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.529721 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts\") pod \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\" (UID: \"c6e27062-a94f-4d8d-8a07-b940d9aa572e\") " Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.530200 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6e27062-a94f-4d8d-8a07-b940d9aa572e" (UID: "c6e27062-a94f-4d8d-8a07-b940d9aa572e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.530270 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqt6l\" (UniqueName: \"kubernetes.io/projected/84699ef3-8d21-4493-8875-81de167ee617-kube-api-access-kqt6l\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.530285 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/823fc346-84d0-4920-bc42-ec213d0c6eef-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.533347 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp" (OuterVolumeSpecName: "kube-api-access-dq5kp") pod "c6e27062-a94f-4d8d-8a07-b940d9aa572e" (UID: "c6e27062-a94f-4d8d-8a07-b940d9aa572e"). InnerVolumeSpecName "kube-api-access-dq5kp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.535872 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n" (OuterVolumeSpecName: "kube-api-access-j5t2n") pod "823fc346-84d0-4920-bc42-ec213d0c6eef" (UID: "823fc346-84d0-4920-bc42-ec213d0c6eef"). InnerVolumeSpecName "kube-api-access-j5t2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.632044 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq5kp\" (UniqueName: \"kubernetes.io/projected/c6e27062-a94f-4d8d-8a07-b940d9aa572e-kube-api-access-dq5kp\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.632095 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5t2n\" (UniqueName: \"kubernetes.io/projected/823fc346-84d0-4920-bc42-ec213d0c6eef-kube-api-access-j5t2n\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.632109 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e27062-a94f-4d8d-8a07-b940d9aa572e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.756460 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4fzxz" event={"ID":"84699ef3-8d21-4493-8875-81de167ee617","Type":"ContainerDied","Data":"6429e62e2e41653fc6362e37e869fbccbb59b4e67585320b1880dd9be47080f2"} Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.756497 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6429e62e2e41653fc6362e37e869fbccbb59b4e67585320b1880dd9be47080f2" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.756503 4722 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-db-create-4fzxz" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.757942 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nq58z" event={"ID":"823fc346-84d0-4920-bc42-ec213d0c6eef","Type":"ContainerDied","Data":"9321baf0393d59bba73cb4fc60396e56bf1d2d8783d6fe2b8d651c63240d3d1c"} Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.757978 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9321baf0393d59bba73cb4fc60396e56bf1d2d8783d6fe2b8d651c63240d3d1c" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.757950 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nq58z" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.759607 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-l92p9" event={"ID":"c6e27062-a94f-4d8d-8a07-b940d9aa572e","Type":"ContainerDied","Data":"3111868eed1037ee898be61210727fc20dfdcb07c463168784d2422bc46d76bc"} Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.759632 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3111868eed1037ee898be61210727fc20dfdcb07c463168784d2422bc46d76bc" Feb 19 19:40:10 crc kubenswrapper[4722]: I0219 19:40:10.759679 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-l92p9" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.152850 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.244385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zxdf\" (UniqueName: \"kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf\") pod \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.244824 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts\") pod \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\" (UID: \"3f262eb9-64a7-4b10-85f9-4bc43d512f60\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.245656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f262eb9-64a7-4b10-85f9-4bc43d512f60" (UID: "3f262eb9-64a7-4b10-85f9-4bc43d512f60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.270631 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf" (OuterVolumeSpecName: "kube-api-access-4zxdf") pod "3f262eb9-64a7-4b10-85f9-4bc43d512f60" (UID: "3f262eb9-64a7-4b10-85f9-4bc43d512f60"). InnerVolumeSpecName "kube-api-access-4zxdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.347606 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f262eb9-64a7-4b10-85f9-4bc43d512f60-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.347895 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zxdf\" (UniqueName: \"kubernetes.io/projected/3f262eb9-64a7-4b10-85f9-4bc43d512f60-kube-api-access-4zxdf\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.380049 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.386962 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.448630 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts\") pod \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.448713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ptb2\" (UniqueName: \"kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2\") pod \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.448793 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts\") pod \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\" (UID: \"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.448843 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gvfs\" (UniqueName: \"kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs\") pod \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\" (UID: \"5b8ebb77-caea-46ca-8989-d2dd37bf2df5\") " Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.449768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" (UID: "6a72e03c-87f6-4d54-8ea1-f8abed33bd2c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.449842 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b8ebb77-caea-46ca-8989-d2dd37bf2df5" (UID: "5b8ebb77-caea-46ca-8989-d2dd37bf2df5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.452551 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs" (OuterVolumeSpecName: "kube-api-access-4gvfs") pod "5b8ebb77-caea-46ca-8989-d2dd37bf2df5" (UID: "5b8ebb77-caea-46ca-8989-d2dd37bf2df5"). InnerVolumeSpecName "kube-api-access-4gvfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.453846 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2" (OuterVolumeSpecName: "kube-api-access-7ptb2") pod "6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" (UID: "6a72e03c-87f6-4d54-8ea1-f8abed33bd2c"). InnerVolumeSpecName "kube-api-access-7ptb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.551572 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.551623 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ptb2\" (UniqueName: \"kubernetes.io/projected/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-kube-api-access-7ptb2\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.551637 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.551648 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gvfs\" (UniqueName: \"kubernetes.io/projected/5b8ebb77-caea-46ca-8989-d2dd37bf2df5-kube-api-access-4gvfs\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.782704 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cda6-account-create-update-45ddh" event={"ID":"5b8ebb77-caea-46ca-8989-d2dd37bf2df5","Type":"ContainerDied","Data":"d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255"} Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 
19:40:11.782745 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d76bb34ad881021c8776e66d30d9c0a04fad76ee27ff7b0d98ead7c346623255" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.782870 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cda6-account-create-update-45ddh" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.785625 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-bc61-account-create-update-km828" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.785899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-bc61-account-create-update-km828" event={"ID":"3f262eb9-64a7-4b10-85f9-4bc43d512f60","Type":"ContainerDied","Data":"3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d"} Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.785940 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7420aa9b0966db7156e47735009b25bdb84305552dfdf8f5ad0a48bfb3382d" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.788581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" event={"ID":"6a72e03c-87f6-4d54-8ea1-f8abed33bd2c","Type":"ContainerDied","Data":"7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75"} Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.788614 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e269cf88b3583a79b8ba5b9d10ca20db17b539a0655c4f93e4676bbe99a4d75" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.788720 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b095-account-create-update-d2ffx" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.811449 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.167:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 19:40:11 crc kubenswrapper[4722]: I0219 19:40:11.811820 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="74c5d98f-45b4-4fd8-876b-3471da720a4b" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.167:9292/healthcheck\": dial tcp 10.217.0.167:9292: i/o timeout" Feb 19 19:40:12 crc kubenswrapper[4722]: E0219 19:40:12.034240 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b8ebb77_caea_46ca_8989_d2dd37bf2df5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a72e03c_87f6_4d54_8ea1_f8abed33bd2c.slice\": RecentStats: unable to find data in memory cache]" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.196574 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cddxh"] Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 19:40:17.200808 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84699ef3-8d21-4493-8875-81de167ee617" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200824 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="84699ef3-8d21-4493-8875-81de167ee617" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 
19:40:17.200834 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e27062-a94f-4d8d-8a07-b940d9aa572e" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200840 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e27062-a94f-4d8d-8a07-b940d9aa572e" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 19:40:17.200854 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8ebb77-caea-46ca-8989-d2dd37bf2df5" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200860 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8ebb77-caea-46ca-8989-d2dd37bf2df5" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 19:40:17.200871 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823fc346-84d0-4920-bc42-ec213d0c6eef" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200876 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="823fc346-84d0-4920-bc42-ec213d0c6eef" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 19:40:17.200888 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f262eb9-64a7-4b10-85f9-4bc43d512f60" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200895 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f262eb9-64a7-4b10-85f9-4bc43d512f60" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: E0219 19:40:17.200905 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.200910 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201097 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="84699ef3-8d21-4493-8875-81de167ee617" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201106 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201121 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8ebb77-caea-46ca-8989-d2dd37bf2df5" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201132 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="823fc346-84d0-4920-bc42-ec213d0c6eef" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201145 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f262eb9-64a7-4b10-85f9-4bc43d512f60" containerName="mariadb-account-create-update" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201172 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e27062-a94f-4d8d-8a07-b940d9aa572e" containerName="mariadb-database-create" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.201918 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.204927 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6spgl" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.205297 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.207647 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.217611 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cddxh"] Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.229122 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj9lg\" (UniqueName: \"kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.229347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.229422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " 
pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.229483 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.331499 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.331591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.331636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.331686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj9lg\" (UniqueName: \"kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.337001 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.337838 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.344963 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.350533 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj9lg\" (UniqueName: \"kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg\") pod \"nova-cell0-conductor-db-sync-cddxh\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:17 crc kubenswrapper[4722]: I0219 19:40:17.523635 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cddxh"
Feb 19 19:40:18 crc kubenswrapper[4722]: I0219 19:40:18.184224 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cddxh"]
Feb 19 19:40:18 crc kubenswrapper[4722]: W0219 19:40:18.199395 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2859f56_714b_43b5_bb67_6ee5493d4f11.slice/crio-6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb WatchSource:0}: Error finding container 6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb: Status 404 returned error can't find the container with id 6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb
Feb 19 19:40:18 crc kubenswrapper[4722]: I0219 19:40:18.863742 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cddxh" event={"ID":"e2859f56-714b-43b5-bb67-6ee5493d4f11","Type":"ContainerStarted","Data":"6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb"}
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.016086 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.016981 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-central-agent" containerID="cri-o://9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84" gracePeriod=30
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.017734 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-notification-agent" containerID="cri-o://94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597" gracePeriod=30
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.017788 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="proxy-httpd" containerID="cri-o://d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4" gracePeriod=30
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.017886 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="sg-core" containerID="cri-o://974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0" gracePeriod=30
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.026647 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.199:3000/\": EOF"
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.921795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerDied","Data":"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4"}
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.921814 4722 generic.go:334] "Generic (PLEG): container finished" podID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerID="d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4" exitCode=0
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.922190 4722 generic.go:334] "Generic (PLEG): container finished" podID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerID="974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0" exitCode=2
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.922204 4722 generic.go:334] "Generic (PLEG): container finished" podID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerID="9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84" exitCode=0
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.922219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerDied","Data":"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0"}
Feb 19 19:40:24 crc kubenswrapper[4722]: I0219 19:40:24.922233 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerDied","Data":"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84"}
Feb 19 19:40:25 crc kubenswrapper[4722]: I0219 19:40:25.932139 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cddxh" event={"ID":"e2859f56-714b-43b5-bb67-6ee5493d4f11","Type":"ContainerStarted","Data":"0edcf275740c511d92faf25dcc6aa827af0e172da4743fa7292bb01babbbeb7e"}
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.554566 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.591017 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cddxh" podStartSLOduration=2.9299823849999997 podStartE2EDuration="9.590992917s" podCreationTimestamp="2026-02-19 19:40:17 +0000 UTC" firstStartedPulling="2026-02-19 19:40:18.202261006 +0000 UTC m=+1317.814611330" lastFinishedPulling="2026-02-19 19:40:24.863271538 +0000 UTC m=+1324.475621862" observedRunningTime="2026-02-19 19:40:25.956619409 +0000 UTC m=+1325.568969753" watchObservedRunningTime="2026-02-19 19:40:26.590992917 +0000 UTC m=+1326.203343241"
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.630417 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") "
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.630628 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") "
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.630772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7dx9\" (UniqueName: \"kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") "
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.630850 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") "
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.630978 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") "
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.631029 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") "
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.631128 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts\") pod \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\" (UID: \"3e17a08d-48a9-43c6-acd3-5bcc13df91df\") "
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.632929 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.633323 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.646328 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts" (OuterVolumeSpecName: "scripts") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.646495 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9" (OuterVolumeSpecName: "kube-api-access-w7dx9") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "kube-api-access-w7dx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.676721 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.718803 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733395 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733586 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7dx9\" (UniqueName: \"kubernetes.io/projected/3e17a08d-48a9-43c6-acd3-5bcc13df91df-kube-api-access-w7dx9\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733676 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733738 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733803 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e17a08d-48a9-43c6-acd3-5bcc13df91df-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.733860 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.802311 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data" (OuterVolumeSpecName: "config-data") pod "3e17a08d-48a9-43c6-acd3-5bcc13df91df" (UID: "3e17a08d-48a9-43c6-acd3-5bcc13df91df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.835754 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e17a08d-48a9-43c6-acd3-5bcc13df91df-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.942439 4722 generic.go:334] "Generic (PLEG): container finished" podID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerID="94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597" exitCode=0
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.942593 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerDied","Data":"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597"}
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.944084 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e17a08d-48a9-43c6-acd3-5bcc13df91df","Type":"ContainerDied","Data":"d27549932ec42cb5113394ad965b534e33b29d9f16e5cedbb8c7ea91e1576243"}
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.944182 4722 scope.go:117] "RemoveContainer" containerID="d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4"
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.942673 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.979704 4722 scope.go:117] "RemoveContainer" containerID="974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0"
Feb 19 19:40:26 crc kubenswrapper[4722]: I0219 19:40:26.989016 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.015256 4722 scope.go:117] "RemoveContainer" containerID="94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.060989 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.061360 4722 scope.go:117] "RemoveContainer" containerID="9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066234 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.066637 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="proxy-httpd"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066660 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="proxy-httpd"
Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.066677 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-notification-agent"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066686 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-notification-agent"
Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.066704 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-central-agent"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066711 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-central-agent"
Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.066719 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="sg-core"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066725 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="sg-core"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066918 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="proxy-httpd"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066938 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-central-agent"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066961 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="ceilometer-notification-agent"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.066969 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" containerName="sg-core"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.068803 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.070676 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.071985 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.085162 4722 scope.go:117] "RemoveContainer" containerID="d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.085583 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e17a08d-48a9-43c6-acd3-5bcc13df91df" path="/var/lib/kubelet/pods/3e17a08d-48a9-43c6-acd3-5bcc13df91df/volumes"
Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.086001 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4\": container with ID starting with d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4 not found: ID does not exist" containerID="d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.086097 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4"} err="failed to get container status \"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4\": rpc error: code = NotFound desc = could not find container \"d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4\": container with ID starting with d768c1cd305418a1538753565d1ed5ff515b4032742365e96ef534a94ff9d3f4 not found: ID does not exist"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.088514 4722 scope.go:117] "RemoveContainer" containerID="974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.086324 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.089537 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0\": container with ID starting with 974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0 not found: ID does not exist" containerID="974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.089649 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0"} err="failed to get container status \"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0\": rpc error: code = NotFound desc = could not find container \"974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0\": container with ID starting with 974864e13a66ccdb7cd73dce522f0368a73fe6912130166624d3ba6b189049c0 not found: ID does not exist"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.089747 4722 scope.go:117] "RemoveContainer" containerID="94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597"
Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.090434 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597\": container with ID starting with 94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597 not found: ID does not exist" containerID="94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.090480 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597"} err="failed to get container status \"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597\": rpc error: code = NotFound desc = could not find container \"94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597\": container with ID starting with 94c88da811e0d6fda10cdaee1492353a2fbd0fbd830a5a1bed28de5566347597 not found: ID does not exist"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.090508 4722 scope.go:117] "RemoveContainer" containerID="9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84"
Feb 19 19:40:27 crc kubenswrapper[4722]: E0219 19:40:27.090772 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84\": container with ID starting with 9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84 not found: ID does not exist" containerID="9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.090853 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84"} err="failed to get container status \"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84\": rpc error: code = NotFound desc = could not find container \"9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84\": container with ID starting with 9268b0174bf0d2e4577bded7d4200e65b8852a101d1c932c9153168ff567ab84 not found: ID does not exist"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.168341 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hgsn\" (UniqueName: \"kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.168569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.168706 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.168797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.168892 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.169114 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.169191 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271206 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271350 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271474 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hgsn\" (UniqueName: \"kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271513 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.271575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.272039 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.272081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.277172 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.277199 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.278541 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.279015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.292678 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hgsn\" (UniqueName: \"kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn\") pod \"ceilometer-0\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.385690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:40:27 crc kubenswrapper[4722]: W0219 19:40:27.830002 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfd4ffd8_1f63_4881_9774_9dda64b8ae5c.slice/crio-0b578e9539c84fc7d8484077c8a6c06daff6b530c3c3b8f5fb43051dd535101b WatchSource:0}: Error finding container 0b578e9539c84fc7d8484077c8a6c06daff6b530c3c3b8f5fb43051dd535101b: Status 404 returned error can't find the container with id 0b578e9539c84fc7d8484077c8a6c06daff6b530c3c3b8f5fb43051dd535101b
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.833063 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:40:27 crc kubenswrapper[4722]: I0219 19:40:27.956085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerStarted","Data":"0b578e9539c84fc7d8484077c8a6c06daff6b530c3c3b8f5fb43051dd535101b"}
Feb 19 19:40:28 crc kubenswrapper[4722]: I0219 19:40:28.969005 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerStarted","Data":"89555d2cfdc64982f801988ef2297fc9a4c1bb04fb28bd06ae98ee1ecd56cd0a"}
Feb 19 19:40:30 crc kubenswrapper[4722]: I0219 19:40:30.996708 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerStarted","Data":"88727b26de3648e330b8018601cf86477430e3aed456e602920db3b0c636f193"}
Feb 19 19:40:32 crc kubenswrapper[4722]: I0219 19:40:32.010032 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerStarted","Data":"41cf9b6a09bc0c1ee74ae82fe251fd733a4e1f343ef37f5f87cd6dd3a0f419e1"}
Feb 19 19:40:33 crc kubenswrapper[4722]: I0219 19:40:33.026661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerStarted","Data":"c936b7b26f2e71263c106589ce3856aa7fd7d2a5e0f20bb894ab7d5bae77b099"}
Feb 19 19:40:33 crc kubenswrapper[4722]: I0219 19:40:33.028064 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 19:40:33 crc kubenswrapper[4722]: I0219 19:40:33.068685 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.15073089 podStartE2EDuration="7.068657055s" podCreationTimestamp="2026-02-19 19:40:26 +0000 UTC" firstStartedPulling="2026-02-19 19:40:27.834833591 +0000 UTC m=+1327.447183925" lastFinishedPulling="2026-02-19 19:40:32.752759766 +0000 UTC m=+1332.365110090" observedRunningTime="2026-02-19 19:40:33.046674611 +0000 UTC m=+1332.659024975" watchObservedRunningTime="2026-02-19 19:40:33.068657055 +0000 UTC m=+1332.681007419"
Feb 19 19:40:36 crc kubenswrapper[4722]: I0219 19:40:36.061598 4722 generic.go:334] "Generic (PLEG): container finished" podID="e2859f56-714b-43b5-bb67-6ee5493d4f11" containerID="0edcf275740c511d92faf25dcc6aa827af0e172da4743fa7292bb01babbbeb7e" exitCode=0
Feb 19 19:40:36 crc kubenswrapper[4722]: I0219 19:40:36.062004 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cddxh" event={"ID":"e2859f56-714b-43b5-bb67-6ee5493d4f11","Type":"ContainerDied","Data":"0edcf275740c511d92faf25dcc6aa827af0e172da4743fa7292bb01babbbeb7e"}
Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.499745 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.609783 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts\") pod \"e2859f56-714b-43b5-bb67-6ee5493d4f11\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.610086 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle\") pod \"e2859f56-714b-43b5-bb67-6ee5493d4f11\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.610178 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj9lg\" (UniqueName: \"kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg\") pod \"e2859f56-714b-43b5-bb67-6ee5493d4f11\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.610221 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data\") pod \"e2859f56-714b-43b5-bb67-6ee5493d4f11\" (UID: \"e2859f56-714b-43b5-bb67-6ee5493d4f11\") " Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.615136 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts" (OuterVolumeSpecName: "scripts") pod "e2859f56-714b-43b5-bb67-6ee5493d4f11" (UID: "e2859f56-714b-43b5-bb67-6ee5493d4f11"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.615965 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg" (OuterVolumeSpecName: "kube-api-access-jj9lg") pod "e2859f56-714b-43b5-bb67-6ee5493d4f11" (UID: "e2859f56-714b-43b5-bb67-6ee5493d4f11"). InnerVolumeSpecName "kube-api-access-jj9lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.639720 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2859f56-714b-43b5-bb67-6ee5493d4f11" (UID: "e2859f56-714b-43b5-bb67-6ee5493d4f11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.642781 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data" (OuterVolumeSpecName: "config-data") pod "e2859f56-714b-43b5-bb67-6ee5493d4f11" (UID: "e2859f56-714b-43b5-bb67-6ee5493d4f11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.712171 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.712390 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj9lg\" (UniqueName: \"kubernetes.io/projected/e2859f56-714b-43b5-bb67-6ee5493d4f11-kube-api-access-jj9lg\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.712477 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:37 crc kubenswrapper[4722]: I0219 19:40:37.712549 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2859f56-714b-43b5-bb67-6ee5493d4f11-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.084000 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cddxh" event={"ID":"e2859f56-714b-43b5-bb67-6ee5493d4f11","Type":"ContainerDied","Data":"6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb"} Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.084251 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c95896fdbaccccb05ee2975e96d928a878ac228bdcc2d7bc3031a24ff6279cb" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.084138 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cddxh" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.196816 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 19:40:38 crc kubenswrapper[4722]: E0219 19:40:38.197228 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2859f56-714b-43b5-bb67-6ee5493d4f11" containerName="nova-cell0-conductor-db-sync" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.197251 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2859f56-714b-43b5-bb67-6ee5493d4f11" containerName="nova-cell0-conductor-db-sync" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.197488 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2859f56-714b-43b5-bb67-6ee5493d4f11" containerName="nova-cell0-conductor-db-sync" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.198221 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.201520 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6spgl" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.202660 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.216895 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.323940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 
19:40:38.324045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.324078 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2chqr\" (UniqueName: \"kubernetes.io/projected/69f96c80-f951-453b-9880-ecd0591dc1bf-kube-api-access-2chqr\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.426497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.427562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.427999 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2chqr\" (UniqueName: \"kubernetes.io/projected/69f96c80-f951-453b-9880-ecd0591dc1bf-kube-api-access-2chqr\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.433524 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.438662 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69f96c80-f951-453b-9880-ecd0591dc1bf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.446953 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2chqr\" (UniqueName: \"kubernetes.io/projected/69f96c80-f951-453b-9880-ecd0591dc1bf-kube-api-access-2chqr\") pod \"nova-cell0-conductor-0\" (UID: \"69f96c80-f951-453b-9880-ecd0591dc1bf\") " pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:38 crc kubenswrapper[4722]: I0219 19:40:38.522852 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:39 crc kubenswrapper[4722]: I0219 19:40:39.092708 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 19 19:40:39 crc kubenswrapper[4722]: W0219 19:40:39.094105 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f96c80_f951_453b_9880_ecd0591dc1bf.slice/crio-20ff520f14e8cd3859bf6213c96aac7ba76ae4ab75f2b7006418591eba750815 WatchSource:0}: Error finding container 20ff520f14e8cd3859bf6213c96aac7ba76ae4ab75f2b7006418591eba750815: Status 404 returned error can't find the container with id 20ff520f14e8cd3859bf6213c96aac7ba76ae4ab75f2b7006418591eba750815 Feb 19 19:40:40 crc kubenswrapper[4722]: I0219 19:40:40.104613 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"69f96c80-f951-453b-9880-ecd0591dc1bf","Type":"ContainerStarted","Data":"538f2ff9d11fea42946598a8e0ff2a03260f9d88fac782cc737a7024d14f62df"} Feb 19 19:40:40 crc kubenswrapper[4722]: I0219 19:40:40.104960 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"69f96c80-f951-453b-9880-ecd0591dc1bf","Type":"ContainerStarted","Data":"20ff520f14e8cd3859bf6213c96aac7ba76ae4ab75f2b7006418591eba750815"} Feb 19 19:40:40 crc kubenswrapper[4722]: I0219 19:40:40.104981 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:40 crc kubenswrapper[4722]: I0219 19:40:40.132087 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.132061939 podStartE2EDuration="2.132061939s" podCreationTimestamp="2026-02-19 19:40:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
19:40:40.120792088 +0000 UTC m=+1339.733142422" watchObservedRunningTime="2026-02-19 19:40:40.132061939 +0000 UTC m=+1339.744412293" Feb 19 19:40:48 crc kubenswrapper[4722]: I0219 19:40:48.563456 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.091566 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-dzq9w"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.093144 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.096135 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.096714 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.110263 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dzq9w"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.253034 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mfl2\" (UniqueName: \"kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.253187 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 
crc kubenswrapper[4722]: I0219 19:40:49.253221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.253257 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.262701 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.264195 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.266092 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.278313 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.354922 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mfl2\" (UniqueName: \"kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.355044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.355080 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.355120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: 
I0219 19:40:49.366342 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.367046 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.368884 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.373888 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.375364 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.383793 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mfl2\" (UniqueName: \"kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2\") pod \"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.385652 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.387677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts\") pod 
\"nova-cell0-cell-mapping-dzq9w\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.451351 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.470574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42f98\" (UniqueName: \"kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.470645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc6h5\" (UniqueName: \"kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.470703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.470754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.471005 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.471050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.471106 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.485216 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.486909 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.491988 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.509340 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.532283 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.533667 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.546780 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.597835 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.597927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.598207 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.598242 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0" Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.598293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " 
pod="openstack/nova-api-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.598442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42f98\" (UniqueName: \"kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.598475 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc6h5\" (UniqueName: \"kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.602961 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.603437 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.627443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.632862 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc6h5\" (UniqueName: \"kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.635040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.641343 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.651482 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42f98\" (UniqueName: \"kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98\") pod \"nova-cell1-novncproxy-0\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.655236 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data\") pod \"nova-api-0\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") " pod="openstack/nova-api-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.686377 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"]
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.688346 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.705022 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"]
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.707690 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.707797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.707879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.707938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7fwn\" (UniqueName: \"kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.707986 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lfmc\" (UniqueName: \"kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.708221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.708263 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.773669 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810552 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810672 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810710 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7fwn\" (UniqueName: \"kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dvwm\" (UniqueName: \"kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810758 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lfmc\" (UniqueName: \"kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810775 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.810823 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.811055 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.811082 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.811118 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.811139 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.812832 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.818598 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.821651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.827108 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.831574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.836277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lfmc\" (UniqueName: \"kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc\") pod \"nova-scheduler-0\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") " pod="openstack/nova-scheduler-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.844538 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7fwn\" (UniqueName: \"kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn\") pod \"nova-metadata-0\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " pod="openstack/nova-metadata-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.883834 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923204 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923269 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dvwm\" (UniqueName: \"kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923391 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.923437 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.924509 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.924813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.928772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.933717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.934304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.967119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dvwm\" (UniqueName: \"kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm\") pod \"dnsmasq-dns-78cd565959-5cmk8\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") " pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:49 crc kubenswrapper[4722]: I0219 19:40:49.985445 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.005613 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.025677 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.153767 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dzq9w"]
Feb 19 19:40:50 crc kubenswrapper[4722]: W0219 19:40:50.259254 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a230c6_6844_4483_a8b4_0ae8073dff8d.slice/crio-bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05 WatchSource:0}: Error finding container bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05: Status 404 returned error can't find the container with id bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05
Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.498612 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.720514 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.836266 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 19:40:50 crc kubenswrapper[4722]: W0219 19:40:50.836759 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb663fb91_fb60_451c_a9c9_7278dbd1c9ac.slice/crio-f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29 WatchSource:0}: Error finding container f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29: Status 404 returned error can't find the container with id f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29
Feb 19 19:40:50 crc kubenswrapper[4722]: I0219 19:40:50.851811 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:40:50 crc kubenswrapper[4722]: W0219 19:40:50.855076 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda14e3a56_49b6_4bc5_81f3_2e8b1da839b8.slice/crio-e99e3c51f8eaf9cc66224cb3dcf044208d804f23037f4fb0e6f60687011d3ec9 WatchSource:0}: Error finding container e99e3c51f8eaf9cc66224cb3dcf044208d804f23037f4fb0e6f60687011d3ec9: Status 404 returned error can't find the container with id e99e3c51f8eaf9cc66224cb3dcf044208d804f23037f4fb0e6f60687011d3ec9
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.007140 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nnkf"]
Feb 19 19:40:51 crc kubenswrapper[4722]: W0219 19:40:51.007589 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e629ce1_0108_4450_bb62_44ca1d2993b6.slice/crio-665dc9cdb3191f51c3a767aeef3d279162c1c7c6e48e1f117eb053fc7bdf1b06 WatchSource:0}: Error finding container 665dc9cdb3191f51c3a767aeef3d279162c1c7c6e48e1f117eb053fc7bdf1b06: Status 404 returned error can't find the container with id 665dc9cdb3191f51c3a767aeef3d279162c1c7c6e48e1f117eb053fc7bdf1b06
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.008654 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.010748 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.010996 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.023595 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"]
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.040622 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nnkf"]
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.057002 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6p5b\" (UniqueName: \"kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.057069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.057132 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.058330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.159985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.160076 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.160178 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6p5b\" (UniqueName: \"kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.160224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.164288 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.164288 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.170666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.194938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6p5b\" (UniqueName: \"kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b\") pod \"nova-cell1-conductor-db-sync-2nnkf\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") " pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.252637 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b663fb91-fb60-451c-a9c9-7278dbd1c9ac","Type":"ContainerStarted","Data":"f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29"}
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.263683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b7f5812-df88-4652-85af-75b6b7f994ee","Type":"ContainerStarted","Data":"3e855c266781287503b5b752c8ff71d55312447fd33f6883decf51bbec4b4e45"}
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.278335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" event={"ID":"5e629ce1-0108-4450-bb62-44ca1d2993b6","Type":"ContainerStarted","Data":"665dc9cdb3191f51c3a767aeef3d279162c1c7c6e48e1f117eb053fc7bdf1b06"}
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.283678 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerStarted","Data":"e99e3c51f8eaf9cc66224cb3dcf044208d804f23037f4fb0e6f60687011d3ec9"}
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.286008 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerStarted","Data":"e752336bd0fae94c244c48636db8246622b4108d65cd7eef8e5be29938427e5c"}
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.287843 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dzq9w" event={"ID":"d1a230c6-6844-4483-a8b4-0ae8073dff8d","Type":"ContainerStarted","Data":"60ef90f5731ef12dc8b60fe6497ea601e5709b737be4d080a2debe1569284fd1"}
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.287890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dzq9w" event={"ID":"d1a230c6-6844-4483-a8b4-0ae8073dff8d","Type":"ContainerStarted","Data":"bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05"}
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.317858 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-dzq9w" podStartSLOduration=2.317837374 podStartE2EDuration="2.317837374s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:51.305621103 +0000 UTC m=+1350.917971437" watchObservedRunningTime="2026-02-19 19:40:51.317837374 +0000 UTC m=+1350.930187698"
Feb 19 19:40:51 crc kubenswrapper[4722]: I0219 19:40:51.328015 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:40:52 crc kubenswrapper[4722]: I0219 19:40:52.089904 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nnkf"]
Feb 19 19:40:52 crc kubenswrapper[4722]: I0219 19:40:52.337448 4722 generic.go:334] "Generic (PLEG): container finished" podID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerID="3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b" exitCode=0
Feb 19 19:40:52 crc kubenswrapper[4722]: I0219 19:40:52.337779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" event={"ID":"5e629ce1-0108-4450-bb62-44ca1d2993b6","Type":"ContainerDied","Data":"3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b"}
Feb 19 19:40:52 crc kubenswrapper[4722]: I0219 19:40:52.352630 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" event={"ID":"106da00f-55de-4b4f-8a57-b8f0b1994c2f","Type":"ContainerStarted","Data":"facd2fa08899d3f5f781d5d5581a5b5e4e0814ddf689e925edac6d9674d3eba2"}
Feb 19 19:40:52 crc kubenswrapper[4722]: I0219 19:40:52.986576 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.003242 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.365485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" event={"ID":"106da00f-55de-4b4f-8a57-b8f0b1994c2f","Type":"ContainerStarted","Data":"b21bbae7a8949776700019b16cbaccbd427e9d7db723a1e91246a4178885c340"}
Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.368437 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" event={"ID":"5e629ce1-0108-4450-bb62-44ca1d2993b6","Type":"ContainerStarted","Data":"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e"}
Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.369412 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.392377 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" podStartSLOduration=3.392362034 podStartE2EDuration="3.392362034s" podCreationTimestamp="2026-02-19 19:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:53.381105594 +0000 UTC m=+1352.993455928" watchObservedRunningTime="2026-02-19 19:40:53.392362034 +0000 UTC m=+1353.004712348"
Feb 19 19:40:53 crc kubenswrapper[4722]: I0219 19:40:53.418536 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" podStartSLOduration=4.418516117 podStartE2EDuration="4.418516117s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:53.410349593 +0000 UTC m=+1353.022699907" watchObservedRunningTime="2026-02-19 19:40:53.418516117 +0000 UTC m=+1353.030866441"
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.405298 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b7f5812-df88-4652-85af-75b6b7f994ee","Type":"ContainerStarted","Data":"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4"}
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.407386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerStarted","Data":"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283"}
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.407421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerStarted","Data":"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655"}
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.407483 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-log" containerID="cri-o://b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" gracePeriod=30
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.407510 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-metadata" containerID="cri-o://9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" gracePeriod=30
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.409791 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerStarted","Data":"e1203e3353e1b22d14cf15e5511afb0b51de1a779175f10f5d565c0c112db8ec"}
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.409833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerStarted","Data":"33c264ae3ae6e4eeb0fddc45a932c5dedfb68e7bb87b529cf2bce1cde21556b3"}
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.414845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b663fb91-fb60-451c-a9c9-7278dbd1c9ac","Type":"ContainerStarted","Data":"346ae374bf887f315658e5888cdaaef27ec7de0b0320851ac3b6d0f93d5058e0"}
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.414958 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://346ae374bf887f315658e5888cdaaef27ec7de0b0320851ac3b6d0f93d5058e0" gracePeriod=30
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.433326 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.92319745 podStartE2EDuration="7.433306926s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="2026-02-19 19:40:50.717273626 +0000 UTC m=+1350.329623950" lastFinishedPulling="2026-02-19 19:40:55.227383102 +0000 UTC m=+1354.839733426" observedRunningTime="2026-02-19 19:40:56.432660256 +0000 UTC m=+1356.045010600" watchObservedRunningTime="2026-02-19 19:40:56.433306926 +0000 UTC m=+1356.045657260"
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.453847 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.064392233 podStartE2EDuration="7.453826244s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="2026-02-19 19:40:50.838621352 +0000 UTC m=+1350.450971676" lastFinishedPulling="2026-02-19 19:40:55.228055323 +0000 UTC m=+1354.840405687" observedRunningTime="2026-02-19 19:40:56.45303178 +0000 UTC m=+1356.065382104" watchObservedRunningTime="2026-02-19 19:40:56.453826244 +0000 UTC m=+1356.066176568"
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.486235 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.770788789 podStartE2EDuration="7.486217362s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="2026-02-19 19:40:50.511941558 +0000 UTC m=+1350.124291892" lastFinishedPulling="2026-02-19 19:40:55.227370151 +0000 UTC m=+1354.839720465" observedRunningTime="2026-02-19 19:40:56.476372705 +0000 UTC m=+1356.088723039" watchObservedRunningTime="2026-02-19 19:40:56.486217362 +0000 UTC m=+1356.098567686"
Feb 19 19:40:56 crc kubenswrapper[4722]: I0219 19:40:56.511656 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.1377026040000002 podStartE2EDuration="7.511602612s" podCreationTimestamp="2026-02-19 19:40:49 +0000 UTC" firstStartedPulling="2026-02-19 19:40:50.857274472 +0000 UTC m=+1350.469624796" lastFinishedPulling="2026-02-19 19:40:55.23117448 +0000 UTC m=+1354.843524804" observedRunningTime="2026-02-19 19:40:56.505025717 +0000 UTC m=+1356.117376031" watchObservedRunningTime="2026-02-19 19:40:56.511602612 +0000 UTC m=+1356.123952936"
Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.237780 4722 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.394979 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.422019 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs\") pod \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.422341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7fwn\" (UniqueName: \"kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn\") pod \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.422465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle\") pod \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.422551 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data\") pod \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\" (UID: \"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8\") " Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.426417 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs" (OuterVolumeSpecName: "logs") pod "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" (UID: "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.471759 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn" (OuterVolumeSpecName: "kube-api-access-m7fwn") pod "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" (UID: "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8"). InnerVolumeSpecName "kube-api-access-m7fwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.473390 4722 generic.go:334] "Generic (PLEG): container finished" podID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerID="9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" exitCode=0 Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.473422 4722 generic.go:334] "Generic (PLEG): container finished" podID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerID="b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" exitCode=143 Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.474770 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.474843 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerDied","Data":"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283"} Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.474872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerDied","Data":"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655"} Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.474882 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a14e3a56-49b6-4bc5-81f3-2e8b1da839b8","Type":"ContainerDied","Data":"e99e3c51f8eaf9cc66224cb3dcf044208d804f23037f4fb0e6f60687011d3ec9"} Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.474899 4722 scope.go:117] "RemoveContainer" containerID="9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.477308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" (UID: "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.488323 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data" (OuterVolumeSpecName: "config-data") pod "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" (UID: "a14e3a56-49b6-4bc5-81f3-2e8b1da839b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.526546 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7fwn\" (UniqueName: \"kubernetes.io/projected/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-kube-api-access-m7fwn\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.526584 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.526598 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.526610 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.659016 4722 scope.go:117] "RemoveContainer" containerID="b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.703384 4722 scope.go:117] "RemoveContainer" containerID="9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" Feb 19 19:40:57 crc kubenswrapper[4722]: E0219 19:40:57.704106 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283\": container with ID starting with 9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283 not found: ID does not exist" containerID="9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 
19:40:57.704143 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283"} err="failed to get container status \"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283\": rpc error: code = NotFound desc = could not find container \"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283\": container with ID starting with 9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283 not found: ID does not exist" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.704183 4722 scope.go:117] "RemoveContainer" containerID="b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" Feb 19 19:40:57 crc kubenswrapper[4722]: E0219 19:40:57.704539 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655\": container with ID starting with b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655 not found: ID does not exist" containerID="b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.704558 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655"} err="failed to get container status \"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655\": rpc error: code = NotFound desc = could not find container \"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655\": container with ID starting with b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655 not found: ID does not exist" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.704571 4722 scope.go:117] "RemoveContainer" containerID="9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283" Feb 19 19:40:57 crc 
kubenswrapper[4722]: I0219 19:40:57.704758 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283"} err="failed to get container status \"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283\": rpc error: code = NotFound desc = could not find container \"9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283\": container with ID starting with 9754875a56daa65645517e9adb9079b2958b26cb0a261036bbb32e6dd92a9283 not found: ID does not exist" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.704776 4722 scope.go:117] "RemoveContainer" containerID="b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.704926 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655"} err="failed to get container status \"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655\": rpc error: code = NotFound desc = could not find container \"b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655\": container with ID starting with b8d0af0a95d5f664bf6913cbf70a88800b7b865bd00df464a198e11f6d1a4655 not found: ID does not exist" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.817207 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.835723 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.852290 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:57 crc kubenswrapper[4722]: E0219 19:40:57.852703 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" 
containerName="nova-metadata-log" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.852719 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-log" Feb 19 19:40:57 crc kubenswrapper[4722]: E0219 19:40:57.852751 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-metadata" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.852758 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-metadata" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.852948 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-metadata" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.852959 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" containerName="nova-metadata-log" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.856657 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.860023 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.860258 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 19:40:57 crc kubenswrapper[4722]: I0219 19:40:57.873050 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.039322 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.039702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.039763 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.039869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l498t\" (UniqueName: \"kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t\") pod \"nova-metadata-0\" 
(UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.039922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142107 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142249 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142427 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l498t\" (UniqueName: \"kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142483 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.142964 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.145901 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.146197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.148346 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.161269 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l498t\" (UniqueName: \"kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t\") pod \"nova-metadata-0\" 
(UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") " pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.188731 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:40:58 crc kubenswrapper[4722]: W0219 19:40:58.674977 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3ea5e74_4865_4550_bf03_3214021a9cda.slice/crio-2bf1295d63e927ccd4e4559a63af3638bc9a37a9047410b86f7de1d3bfe4945d WatchSource:0}: Error finding container 2bf1295d63e927ccd4e4559a63af3638bc9a37a9047410b86f7de1d3bfe4945d: Status 404 returned error can't find the container with id 2bf1295d63e927ccd4e4559a63af3638bc9a37a9047410b86f7de1d3bfe4945d Feb 19 19:40:58 crc kubenswrapper[4722]: I0219 19:40:58.680183 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.081996 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a14e3a56-49b6-4bc5-81f3-2e8b1da839b8" path="/var/lib/kubelet/pods/a14e3a56-49b6-4bc5-81f3-2e8b1da839b8/volumes" Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.502120 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerStarted","Data":"808417d00e6b28c9c6aa522c1977a475326ba92e3cf741e98e583db7b0e9115f"} Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.502193 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerStarted","Data":"24e4af34fca0eb266fb1f08b79e4f8ece3a7aaae432851469c316582f3b8b813"} Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.502207 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerStarted","Data":"2bf1295d63e927ccd4e4559a63af3638bc9a37a9047410b86f7de1d3bfe4945d"} Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.504536 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1a230c6-6844-4483-a8b4-0ae8073dff8d" containerID="60ef90f5731ef12dc8b60fe6497ea601e5709b737be4d080a2debe1569284fd1" exitCode=0 Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.504630 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dzq9w" event={"ID":"d1a230c6-6844-4483-a8b4-0ae8073dff8d","Type":"ContainerDied","Data":"60ef90f5731ef12dc8b60fe6497ea601e5709b737be4d080a2debe1569284fd1"} Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.523777 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.523760508 podStartE2EDuration="2.523760508s" podCreationTimestamp="2026-02-19 19:40:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:40:59.516884063 +0000 UTC m=+1359.129234387" watchObservedRunningTime="2026-02-19 19:40:59.523760508 +0000 UTC m=+1359.136110832" Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.774665 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.775368 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:40:59 crc kubenswrapper[4722]: I0219 19:40:59.884648 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.006820 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 19:41:00 crc kubenswrapper[4722]: 
I0219 19:41:00.006893 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.028384 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.058899 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.103126 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"] Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.103649 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="dnsmasq-dns" containerID="cri-o://b0f785695269b6ae9fc48dfba62c1a732aa42aadccca7a02f2d798ea3429fbac" gracePeriod=10 Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.516581 4722 generic.go:334] "Generic (PLEG): container finished" podID="8f530e65-8397-49d6-929a-201bb5dfe585" containerID="b0f785695269b6ae9fc48dfba62c1a732aa42aadccca7a02f2d798ea3429fbac" exitCode=0 Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.516783 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" event={"ID":"8f530e65-8397-49d6-929a-201bb5dfe585","Type":"ContainerDied","Data":"b0f785695269b6ae9fc48dfba62c1a732aa42aadccca7a02f2d798ea3429fbac"} Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.580495 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.858583 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:00 crc kubenswrapper[4722]: I0219 19:41:00.859096 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.108296 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dzq9w" Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.205777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts\") pod \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.205834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mfl2\" (UniqueName: \"kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2\") pod \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.205867 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle\") pod \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") " Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.206022 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data\") pod 
\"d1a230c6-6844-4483-a8b4-0ae8073dff8d\" (UID: \"d1a230c6-6844-4483-a8b4-0ae8073dff8d\") "
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.213460 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2" (OuterVolumeSpecName: "kube-api-access-5mfl2") pod "d1a230c6-6844-4483-a8b4-0ae8073dff8d" (UID: "d1a230c6-6844-4483-a8b4-0ae8073dff8d"). InnerVolumeSpecName "kube-api-access-5mfl2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.240676 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts" (OuterVolumeSpecName: "scripts") pod "d1a230c6-6844-4483-a8b4-0ae8073dff8d" (UID: "d1a230c6-6844-4483-a8b4-0ae8073dff8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.246293 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1a230c6-6844-4483-a8b4-0ae8073dff8d" (UID: "d1a230c6-6844-4483-a8b4-0ae8073dff8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.284429 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data" (OuterVolumeSpecName: "config-data") pod "d1a230c6-6844-4483-a8b4-0ae8073dff8d" (UID: "d1a230c6-6844-4483-a8b4-0ae8073dff8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.310195 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.310229 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mfl2\" (UniqueName: \"kubernetes.io/projected/d1a230c6-6844-4483-a8b4-0ae8073dff8d-kube-api-access-5mfl2\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.310268 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.310278 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a230c6-6844-4483-a8b4-0ae8073dff8d-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.420508 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8"
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514096 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") "
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514175 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") "
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") "
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514294 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") "
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514317 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") "
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.514468 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kgkx\" (UniqueName: \"kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx\") pod \"8f530e65-8397-49d6-929a-201bb5dfe585\" (UID: \"8f530e65-8397-49d6-929a-201bb5dfe585\") "
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.520509 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx" (OuterVolumeSpecName: "kube-api-access-5kgkx") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "kube-api-access-5kgkx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.545711 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dzq9w"
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.546743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dzq9w" event={"ID":"d1a230c6-6844-4483-a8b4-0ae8073dff8d","Type":"ContainerDied","Data":"bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05"}
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.546811 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdba7561ab6578109f42cf06a77bcff40e40ca9d14ad2f78b3e4e5201c687f05"
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.550090 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8"
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.550702 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-2g6g8" event={"ID":"8f530e65-8397-49d6-929a-201bb5dfe585","Type":"ContainerDied","Data":"6a529cc3a96af23463f3dfa462bf02cb46f29fb8e36534fccb322ef7ab7a6728"}
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.550747 4722 scope.go:117] "RemoveContainer" containerID="b0f785695269b6ae9fc48dfba62c1a732aa42aadccca7a02f2d798ea3429fbac"
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.591346 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config" (OuterVolumeSpecName: "config") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.608756 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.617933 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.617961 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.617971 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kgkx\" (UniqueName: \"kubernetes.io/projected/8f530e65-8397-49d6-929a-201bb5dfe585-kube-api-access-5kgkx\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.632658 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.655730 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.668730 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8f530e65-8397-49d6-929a-201bb5dfe585" (UID: "8f530e65-8397-49d6-929a-201bb5dfe585"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.703454 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.719823 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.719856 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.719869 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8f530e65-8397-49d6-929a-201bb5dfe585-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.733102 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.733365 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-log" containerID="cri-o://24e4af34fca0eb266fb1f08b79e4f8ece3a7aaae432851469c316582f3b8b813" gracePeriod=30
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.733799 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-metadata" containerID="cri-o://808417d00e6b28c9c6aa522c1977a475326ba92e3cf741e98e583db7b0e9115f" gracePeriod=30
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.741317 4722 scope.go:117] "RemoveContainer" containerID="380c536ebfd3cf4e5ded9eb26bb64cd838a985f8d5ba0c199a97d05a07b511f3"
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.754107 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.754338 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-log" containerID="cri-o://33c264ae3ae6e4eeb0fddc45a932c5dedfb68e7bb87b529cf2bce1cde21556b3" gracePeriod=30
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.754464 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-api" containerID="cri-o://e1203e3353e1b22d14cf15e5511afb0b51de1a779175f10f5d565c0c112db8ec" gracePeriod=30
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.885621 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"]
Feb 19 19:41:01 crc kubenswrapper[4722]: I0219 19:41:01.895400 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-2g6g8"]
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.349932 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.350137 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" containerName="kube-state-metrics" containerID="cri-o://3a2f38c278decbb381ff361931bea01935f3b90be53c0932153ee1cc0d0759f2" gracePeriod=30
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.559485 4722 generic.go:334] "Generic (PLEG): container finished" podID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerID="33c264ae3ae6e4eeb0fddc45a932c5dedfb68e7bb87b529cf2bce1cde21556b3" exitCode=143
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.559574 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerDied","Data":"33c264ae3ae6e4eeb0fddc45a932c5dedfb68e7bb87b529cf2bce1cde21556b3"}
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.561724 4722 generic.go:334] "Generic (PLEG): container finished" podID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" containerID="3a2f38c278decbb381ff361931bea01935f3b90be53c0932153ee1cc0d0759f2" exitCode=2
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.561799 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14a7aae0-6a51-49ed-b4dd-9b274885d1da","Type":"ContainerDied","Data":"3a2f38c278decbb381ff361931bea01935f3b90be53c0932153ee1cc0d0759f2"}
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.565579 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerID="808417d00e6b28c9c6aa522c1977a475326ba92e3cf741e98e583db7b0e9115f" exitCode=0
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.565607 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerID="24e4af34fca0eb266fb1f08b79e4f8ece3a7aaae432851469c316582f3b8b813" exitCode=143
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.565658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerDied","Data":"808417d00e6b28c9c6aa522c1977a475326ba92e3cf741e98e583db7b0e9115f"}
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.565698 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerDied","Data":"24e4af34fca0eb266fb1f08b79e4f8ece3a7aaae432851469c316582f3b8b813"}
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.565751 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0b7f5812-df88-4652-85af-75b6b7f994ee" containerName="nova-scheduler-scheduler" containerID="cri-o://87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4" gracePeriod=30
Feb 19 19:41:02 crc kubenswrapper[4722]: I0219 19:41:02.967047 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.043297 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l498t\" (UniqueName: \"kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t\") pod \"e3ea5e74-4865-4550-bf03-3214021a9cda\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") "
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.043548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data\") pod \"e3ea5e74-4865-4550-bf03-3214021a9cda\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") "
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.043677 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle\") pod \"e3ea5e74-4865-4550-bf03-3214021a9cda\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") "
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.043713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs\") pod \"e3ea5e74-4865-4550-bf03-3214021a9cda\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") "
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.043765 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs\") pod \"e3ea5e74-4865-4550-bf03-3214021a9cda\" (UID: \"e3ea5e74-4865-4550-bf03-3214021a9cda\") "
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.044523 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs" (OuterVolumeSpecName: "logs") pod "e3ea5e74-4865-4550-bf03-3214021a9cda" (UID: "e3ea5e74-4865-4550-bf03-3214021a9cda"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.051344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t" (OuterVolumeSpecName: "kube-api-access-l498t") pod "e3ea5e74-4865-4550-bf03-3214021a9cda" (UID: "e3ea5e74-4865-4550-bf03-3214021a9cda"). InnerVolumeSpecName "kube-api-access-l498t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.074793 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3ea5e74-4865-4550-bf03-3214021a9cda" (UID: "e3ea5e74-4865-4550-bf03-3214021a9cda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.089929 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data" (OuterVolumeSpecName: "config-data") pod "e3ea5e74-4865-4550-bf03-3214021a9cda" (UID: "e3ea5e74-4865-4550-bf03-3214021a9cda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.104043 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" path="/var/lib/kubelet/pods/8f530e65-8397-49d6-929a-201bb5dfe585/volumes"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.104200 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e3ea5e74-4865-4550-bf03-3214021a9cda" (UID: "e3ea5e74-4865-4550-bf03-3214021a9cda"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.147964 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.148003 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.148015 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3ea5e74-4865-4550-bf03-3214021a9cda-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.148025 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3ea5e74-4865-4550-bf03-3214021a9cda-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.148036 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l498t\" (UniqueName: \"kubernetes.io/projected/e3ea5e74-4865-4550-bf03-3214021a9cda-kube-api-access-l498t\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.585784 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3ea5e74-4865-4550-bf03-3214021a9cda","Type":"ContainerDied","Data":"2bf1295d63e927ccd4e4559a63af3638bc9a37a9047410b86f7de1d3bfe4945d"}
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.585844 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.586097 4722 scope.go:117] "RemoveContainer" containerID="808417d00e6b28c9c6aa522c1977a475326ba92e3cf741e98e583db7b0e9115f"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.755451 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.770099 4722 scope.go:117] "RemoveContainer" containerID="24e4af34fca0eb266fb1f08b79e4f8ece3a7aaae432851469c316582f3b8b813"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.772254 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.783569 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794296 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794707 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a230c6-6844-4483-a8b4-0ae8073dff8d" containerName="nova-manage"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794722 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a230c6-6844-4483-a8b4-0ae8073dff8d" containerName="nova-manage"
Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794737 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" containerName="kube-state-metrics"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794743 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" containerName="kube-state-metrics"
Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794766 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-metadata"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794772 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-metadata"
Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794785 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-log"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794791 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-log"
Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794806 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="init"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794812 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="init"
Feb 19 19:41:03 crc kubenswrapper[4722]: E0219 19:41:03.794827 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="dnsmasq-dns"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794833 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="dnsmasq-dns"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.794998 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f530e65-8397-49d6-929a-201bb5dfe585" containerName="dnsmasq-dns"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.795011 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-metadata"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.795018 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" containerName="kube-state-metrics"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.795032 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" containerName="nova-metadata-log"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.795045 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a230c6-6844-4483-a8b4-0ae8073dff8d" containerName="nova-manage"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.796069 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.804187 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.804318 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.830064 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865067 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c6z8\" (UniqueName: \"kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8\") pod \"14a7aae0-6a51-49ed-b4dd-9b274885d1da\" (UID: \"14a7aae0-6a51-49ed-b4dd-9b274885d1da\") "
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865398 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9cmx\" (UniqueName: \"kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.865620 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.890732 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8" (OuterVolumeSpecName: "kube-api-access-8c6z8") pod "14a7aae0-6a51-49ed-b4dd-9b274885d1da" (UID: "14a7aae0-6a51-49ed-b4dd-9b274885d1da"). InnerVolumeSpecName "kube-api-access-8c6z8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.967932 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968040 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968073 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968104 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968163 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9cmx\" (UniqueName: \"kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968223 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c6z8\" (UniqueName: \"kubernetes.io/projected/14a7aae0-6a51-49ed-b4dd-9b274885d1da-kube-api-access-8c6z8\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.968690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.973611 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.979466 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.983719 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9cmx\" (UniqueName: \"kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:03 crc kubenswrapper[4722]: I0219 19:41:03.986829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data\") pod \"nova-metadata-0\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.131406 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.146292 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.172404 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data\") pod \"0b7f5812-df88-4652-85af-75b6b7f994ee\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") "
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.172607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle\") pod \"0b7f5812-df88-4652-85af-75b6b7f994ee\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") "
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.172633 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lfmc\" (UniqueName: \"kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc\") pod \"0b7f5812-df88-4652-85af-75b6b7f994ee\" (UID: \"0b7f5812-df88-4652-85af-75b6b7f994ee\") "
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.212706 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc" (OuterVolumeSpecName: "kube-api-access-9lfmc") pod "0b7f5812-df88-4652-85af-75b6b7f994ee" (UID: "0b7f5812-df88-4652-85af-75b6b7f994ee"). InnerVolumeSpecName "kube-api-access-9lfmc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.224691 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b7f5812-df88-4652-85af-75b6b7f994ee" (UID: "0b7f5812-df88-4652-85af-75b6b7f994ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.238596 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data" (OuterVolumeSpecName: "config-data") pod "0b7f5812-df88-4652-85af-75b6b7f994ee" (UID: "0b7f5812-df88-4652-85af-75b6b7f994ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.275841 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.276186 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b7f5812-df88-4652-85af-75b6b7f994ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.276198 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lfmc\" (UniqueName: \"kubernetes.io/projected/0b7f5812-df88-4652-85af-75b6b7f994ee-kube-api-access-9lfmc\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.596952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"14a7aae0-6a51-49ed-b4dd-9b274885d1da","Type":"ContainerDied","Data":"5545dc8f3e2de249c7840626da07d4ee4ba5dd553856353617c9f89c2873d54d"}
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.596979 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.597015 4722 scope.go:117] "RemoveContainer" containerID="3a2f38c278decbb381ff361931bea01935f3b90be53c0932153ee1cc0d0759f2"
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.599296 4722 generic.go:334] "Generic (PLEG): container finished" podID="0b7f5812-df88-4652-85af-75b6b7f994ee" containerID="87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4" exitCode=0
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.599377 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b7f5812-df88-4652-85af-75b6b7f994ee","Type":"ContainerDied","Data":"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4"}
Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.599381 4722 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.599418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0b7f5812-df88-4652-85af-75b6b7f994ee","Type":"ContainerDied","Data":"3e855c266781287503b5b752c8ff71d55312447fd33f6883decf51bbec4b4e45"} Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.630735 4722 scope.go:117] "RemoveContainer" containerID="87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.635515 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.649124 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.661015 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.682728 4722 scope.go:117] "RemoveContainer" containerID="87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4" Feb 19 19:41:04 crc kubenswrapper[4722]: E0219 19:41:04.683364 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4\": container with ID starting with 87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4 not found: ID does not exist" containerID="87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.683405 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4"} err="failed to get container status \"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4\": rpc error: code = 
NotFound desc = could not find container \"87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4\": container with ID starting with 87d72502795bc34e3b1eb6602892a8ee62ab9879fce6184883b6f07db34c63d4 not found: ID does not exist" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.692213 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.702644 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: E0219 19:41:04.703119 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7f5812-df88-4652-85af-75b6b7f994ee" containerName="nova-scheduler-scheduler" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.703131 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7f5812-df88-4652-85af-75b6b7f994ee" containerName="nova-scheduler-scheduler" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.703341 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7f5812-df88-4652-85af-75b6b7f994ee" containerName="nova-scheduler-scheduler" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.704066 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.706029 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.713554 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.723256 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.724642 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.732229 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.732600 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.732872 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.742681 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807277 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77ghz\" (UniqueName: \"kubernetes.io/projected/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-api-access-77ghz\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807365 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807480 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807502 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjj24\" (UniqueName: \"kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.807559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.909587 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77ghz\" (UniqueName: \"kubernetes.io/projected/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-api-access-77ghz\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.909653 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.909710 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.909778 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.909797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjj24\" (UniqueName: \"kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.911622 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.911655 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.919285 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.920875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.923818 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.927637 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.939849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77ghz\" (UniqueName: \"kubernetes.io/projected/f8493c9f-328a-446d-8110-5879a7aedd2b-kube-api-access-77ghz\") pod \"kube-state-metrics-0\" (UID: 
\"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.948206 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8493c9f-328a-446d-8110-5879a7aedd2b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f8493c9f-328a-446d-8110-5879a7aedd2b\") " pod="openstack/kube-state-metrics-0" Feb 19 19:41:04 crc kubenswrapper[4722]: I0219 19:41:04.971956 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjj24\" (UniqueName: \"kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24\") pod \"nova-scheduler-0\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.079886 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.093649 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b7f5812-df88-4652-85af-75b6b7f994ee" path="/var/lib/kubelet/pods/0b7f5812-df88-4652-85af-75b6b7f994ee/volumes" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.097295 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a7aae0-6a51-49ed-b4dd-9b274885d1da" path="/var/lib/kubelet/pods/14a7aae0-6a51-49ed-b4dd-9b274885d1da/volumes" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.097506 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.097936 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ea5e74-4865-4550-bf03-3214021a9cda" path="/var/lib/kubelet/pods/e3ea5e74-4865-4550-bf03-3214021a9cda/volumes" Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.611145 4722 generic.go:334] "Generic (PLEG): container finished" podID="106da00f-55de-4b4f-8a57-b8f0b1994c2f" containerID="b21bbae7a8949776700019b16cbaccbd427e9d7db723a1e91246a4178885c340" exitCode=0 Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.611322 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" event={"ID":"106da00f-55de-4b4f-8a57-b8f0b1994c2f","Type":"ContainerDied","Data":"b21bbae7a8949776700019b16cbaccbd427e9d7db723a1e91246a4178885c340"} Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.624656 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerStarted","Data":"a02b0085e2ba8a5b6e93ff14529363efb51fc8a03bd2360fc27b7d63f1740346"} Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.624705 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerStarted","Data":"9e174e4ac1407291135c0ab1018e954feca197504199df9de4977de4b585f9e2"} Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.624718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerStarted","Data":"1d5b78abbb5e2a59e1b1457349c70f190753858ae936d0ab19bb55cc724af44f"} Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.625568 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:05 crc kubenswrapper[4722]: W0219 
19:41:05.644842 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8493c9f_328a_446d_8110_5879a7aedd2b.slice/crio-927a5739968991664c79d9545f0fce395a6bde39570d851a33fe5046a9b1d17e WatchSource:0}: Error finding container 927a5739968991664c79d9545f0fce395a6bde39570d851a33fe5046a9b1d17e: Status 404 returned error can't find the container with id 927a5739968991664c79d9545f0fce395a6bde39570d851a33fe5046a9b1d17e Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.651282 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.651661 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-central-agent" containerID="cri-o://89555d2cfdc64982f801988ef2297fc9a4c1bb04fb28bd06ae98ee1ecd56cd0a" gracePeriod=30 Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.651713 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="sg-core" containerID="cri-o://41cf9b6a09bc0c1ee74ae82fe251fd733a4e1f343ef37f5f87cd6dd3a0f419e1" gracePeriod=30 Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.651760 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="proxy-httpd" containerID="cri-o://c936b7b26f2e71263c106589ce3856aa7fd7d2a5e0f20bb894ab7d5bae77b099" gracePeriod=30 Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.651731 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-notification-agent" 
containerID="cri-o://88727b26de3648e330b8018601cf86477430e3aed456e602920db3b0c636f193" gracePeriod=30 Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.694689 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 19:41:05 crc kubenswrapper[4722]: I0219 19:41:05.710798 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.710781115 podStartE2EDuration="2.710781115s" podCreationTimestamp="2026-02-19 19:41:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:05.66743453 +0000 UTC m=+1365.279784854" watchObservedRunningTime="2026-02-19 19:41:05.710781115 +0000 UTC m=+1365.323131429" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.648377 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"339423c2-068b-48f8-8117-04f6a37ceaf9","Type":"ContainerStarted","Data":"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.648969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"339423c2-068b-48f8-8117-04f6a37ceaf9","Type":"ContainerStarted","Data":"b861b10dd2203a847e9aec1463641c9c77079ce2640617c70a1b02fbcb8c691f"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657782 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerID="c936b7b26f2e71263c106589ce3856aa7fd7d2a5e0f20bb894ab7d5bae77b099" exitCode=0 Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657812 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerID="41cf9b6a09bc0c1ee74ae82fe251fd733a4e1f343ef37f5f87cd6dd3a0f419e1" exitCode=2 Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657822 4722 
generic.go:334] "Generic (PLEG): container finished" podID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerID="88727b26de3648e330b8018601cf86477430e3aed456e602920db3b0c636f193" exitCode=0 Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657829 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerID="89555d2cfdc64982f801988ef2297fc9a4c1bb04fb28bd06ae98ee1ecd56cd0a" exitCode=0 Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerDied","Data":"c936b7b26f2e71263c106589ce3856aa7fd7d2a5e0f20bb894ab7d5bae77b099"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657896 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerDied","Data":"41cf9b6a09bc0c1ee74ae82fe251fd733a4e1f343ef37f5f87cd6dd3a0f419e1"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerDied","Data":"88727b26de3648e330b8018601cf86477430e3aed456e602920db3b0c636f193"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.657913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerDied","Data":"89555d2cfdc64982f801988ef2297fc9a4c1bb04fb28bd06ae98ee1ecd56cd0a"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.660368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8493c9f-328a-446d-8110-5879a7aedd2b","Type":"ContainerStarted","Data":"5fb1969d69a84b92a4b54b843e64e8275f19bbb8264232f994396b4209fc340d"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.660425 
4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.660436 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f8493c9f-328a-446d-8110-5879a7aedd2b","Type":"ContainerStarted","Data":"927a5739968991664c79d9545f0fce395a6bde39570d851a33fe5046a9b1d17e"} Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.674988 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.674970107 podStartE2EDuration="2.674970107s" podCreationTimestamp="2026-02-19 19:41:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:06.668672141 +0000 UTC m=+1366.281022465" watchObservedRunningTime="2026-02-19 19:41:06.674970107 +0000 UTC m=+1366.287320431" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.696512 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.212092656 podStartE2EDuration="2.69649406s" podCreationTimestamp="2026-02-19 19:41:04 +0000 UTC" firstStartedPulling="2026-02-19 19:41:05.648449867 +0000 UTC m=+1365.260800191" lastFinishedPulling="2026-02-19 19:41:06.132851251 +0000 UTC m=+1365.745201595" observedRunningTime="2026-02-19 19:41:06.684529916 +0000 UTC m=+1366.296880240" watchObservedRunningTime="2026-02-19 19:41:06.69649406 +0000 UTC m=+1366.308844384" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.907952 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958000 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958113 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958246 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hgsn\" (UniqueName: \"kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958315 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958338 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958369 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.958452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle\") pod \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\" (UID: \"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c\") " Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.960261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.961765 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.965335 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts" (OuterVolumeSpecName: "scripts") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:06 crc kubenswrapper[4722]: I0219 19:41:06.968627 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn" (OuterVolumeSpecName: "kube-api-access-5hgsn") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "kube-api-access-5hgsn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.032397 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.054661 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.064040 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.064066 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.064077 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.064086 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.064096 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hgsn\" (UniqueName: \"kubernetes.io/projected/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-kube-api-access-5hgsn\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.075767 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.128058 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data" (OuterVolumeSpecName: "config-data") pod "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" (UID: "dfd4ffd8-1f63-4881-9774-9dda64b8ae5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.165786 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle\") pod \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") "
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.166411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6p5b\" (UniqueName: \"kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b\") pod \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") "
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.166630 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data\") pod \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") "
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.166782 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts\") pod \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\" (UID: \"106da00f-55de-4b4f-8a57-b8f0b1994c2f\") "
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.167734 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.167911 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.170853 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b" (OuterVolumeSpecName: "kube-api-access-q6p5b") pod "106da00f-55de-4b4f-8a57-b8f0b1994c2f" (UID: "106da00f-55de-4b4f-8a57-b8f0b1994c2f"). InnerVolumeSpecName "kube-api-access-q6p5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.173270 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts" (OuterVolumeSpecName: "scripts") pod "106da00f-55de-4b4f-8a57-b8f0b1994c2f" (UID: "106da00f-55de-4b4f-8a57-b8f0b1994c2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.195356 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "106da00f-55de-4b4f-8a57-b8f0b1994c2f" (UID: "106da00f-55de-4b4f-8a57-b8f0b1994c2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.223963 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data" (OuterVolumeSpecName: "config-data") pod "106da00f-55de-4b4f-8a57-b8f0b1994c2f" (UID: "106da00f-55de-4b4f-8a57-b8f0b1994c2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.276192 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.276224 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6p5b\" (UniqueName: \"kubernetes.io/projected/106da00f-55de-4b4f-8a57-b8f0b1994c2f-kube-api-access-q6p5b\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.276257 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.276267 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106da00f-55de-4b4f-8a57-b8f0b1994c2f-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.673303 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2nnkf"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.674380 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2nnkf" event={"ID":"106da00f-55de-4b4f-8a57-b8f0b1994c2f","Type":"ContainerDied","Data":"facd2fa08899d3f5f781d5d5581a5b5e4e0814ddf689e925edac6d9674d3eba2"}
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.674426 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="facd2fa08899d3f5f781d5d5581a5b5e4e0814ddf689e925edac6d9674d3eba2"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.679103 4722 generic.go:334] "Generic (PLEG): container finished" podID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerID="e1203e3353e1b22d14cf15e5511afb0b51de1a779175f10f5d565c0c112db8ec" exitCode=0
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.679178 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerDied","Data":"e1203e3353e1b22d14cf15e5511afb0b51de1a779175f10f5d565c0c112db8ec"}
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.679201 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a47ca62a-2546-47ec-80f7-1aa7e739e43e","Type":"ContainerDied","Data":"e752336bd0fae94c244c48636db8246622b4108d65cd7eef8e5be29938427e5c"}
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.679211 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e752336bd0fae94c244c48636db8246622b4108d65cd7eef8e5be29938427e5c"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.686323 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.686133 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dfd4ffd8-1f63-4881-9774-9dda64b8ae5c","Type":"ContainerDied","Data":"0b578e9539c84fc7d8484077c8a6c06daff6b530c3c3b8f5fb43051dd535101b"}
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.693018 4722 scope.go:117] "RemoveContainer" containerID="c936b7b26f2e71263c106589ce3856aa7fd7d2a5e0f20bb894ab7d5bae77b099"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.743845 4722 scope.go:117] "RemoveContainer" containerID="41cf9b6a09bc0c1ee74ae82fe251fd733a4e1f343ef37f5f87cd6dd3a0f419e1"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745300 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.745780 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="proxy-httpd"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745796 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="proxy-httpd"
Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.745805 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-notification-agent"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745811 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-notification-agent"
Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.745835 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106da00f-55de-4b4f-8a57-b8f0b1994c2f" containerName="nova-cell1-conductor-db-sync"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745841 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="106da00f-55de-4b4f-8a57-b8f0b1994c2f" containerName="nova-cell1-conductor-db-sync"
Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.745854 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-central-agent"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745859 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-central-agent"
Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.745868 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="sg-core"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.745874 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="sg-core"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746039 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="sg-core"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746051 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-central-agent"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746064 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="106da00f-55de-4b4f-8a57-b8f0b1994c2f" containerName="nova-cell1-conductor-db-sync"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746081 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="proxy-httpd"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746095 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" containerName="ceilometer-notification-agent"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.746900 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.748919 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.750542 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.773348 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.783449 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788061 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data\") pod \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") "
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788212 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle\") pod \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") "
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788294 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs\") pod \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") "
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788328 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc6h5\" (UniqueName: \"kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5\") pod \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\" (UID: \"a47ca62a-2546-47ec-80f7-1aa7e739e43e\") "
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788803 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788932 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmplr\" (UniqueName: \"kubernetes.io/projected/f7880856-0db7-4bbf-9202-04f90868fc1d-kube-api-access-hmplr\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.788966 4722 scope.go:117] "RemoveContainer" containerID="88727b26de3648e330b8018601cf86477430e3aed456e602920db3b0c636f193"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.795425 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs" (OuterVolumeSpecName: "logs") pod "a47ca62a-2546-47ec-80f7-1aa7e739e43e" (UID: "a47ca62a-2546-47ec-80f7-1aa7e739e43e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.820803 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5" (OuterVolumeSpecName: "kube-api-access-qc6h5") pod "a47ca62a-2546-47ec-80f7-1aa7e739e43e" (UID: "a47ca62a-2546-47ec-80f7-1aa7e739e43e"). InnerVolumeSpecName "kube-api-access-qc6h5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.831975 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.837459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a47ca62a-2546-47ec-80f7-1aa7e739e43e" (UID: "a47ca62a-2546-47ec-80f7-1aa7e739e43e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901495 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmplr\" (UniqueName: \"kubernetes.io/projected/f7880856-0db7-4bbf-9202-04f90868fc1d-kube-api-access-hmplr\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901721 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901731 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a47ca62a-2546-47ec-80f7-1aa7e739e43e-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.901741 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc6h5\" (UniqueName: \"kubernetes.io/projected/a47ca62a-2546-47ec-80f7-1aa7e739e43e-kube-api-access-qc6h5\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.905598 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.906364 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-log"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.906383 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-log"
Feb 19 19:41:07 crc kubenswrapper[4722]: E0219 19:41:07.906427 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-api"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.906434 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-api"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.906699 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-log"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.906733 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" containerName="nova-api-api"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.910366 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.914511 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.914740 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.914948 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.917343 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.924799 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7880856-0db7-4bbf-9202-04f90868fc1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.926203 4722 scope.go:117] "RemoveContainer" containerID="89555d2cfdc64982f801988ef2297fc9a4c1bb04fb28bd06ae98ee1ecd56cd0a"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.938735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmplr\" (UniqueName: \"kubernetes.io/projected/f7880856-0db7-4bbf-9202-04f90868fc1d-kube-api-access-hmplr\") pod \"nova-cell1-conductor-0\" (UID: \"f7880856-0db7-4bbf-9202-04f90868fc1d\") " pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.942370 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data" (OuterVolumeSpecName: "config-data") pod "a47ca62a-2546-47ec-80f7-1aa7e739e43e" (UID: "a47ca62a-2546-47ec-80f7-1aa7e739e43e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:07 crc kubenswrapper[4722]: I0219 19:41:07.987386 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003596 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrfq\" (UniqueName: \"kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003880 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003918 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.003994 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.004017 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.004037 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.004095 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a47ca62a-2546-47ec-80f7-1aa7e739e43e-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.091631 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106436 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106455 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106685 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrfq\" (UniqueName: \"kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106710 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.106808 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.111268 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.111586 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.111808 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.113203 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.115488 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.119425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.124215 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.132733 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfrfq\" (UniqueName: \"kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq\") pod \"ceilometer-0\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.254698 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.641353 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.704683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7880856-0db7-4bbf-9202-04f90868fc1d","Type":"ContainerStarted","Data":"bed3c002650ca5de4b056281524f0247c520fc23a59bc8cb6c23476a7afb5560"}
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.704757 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.790814 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: W0219 19:41:08.791583 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35309d31_c095_492f_8645_f99a629dafd5.slice/crio-c75a051258432020e538a8b547aaf892bef9988a1634efdd8324c9ff11bd5121 WatchSource:0}: Error finding container c75a051258432020e538a8b547aaf892bef9988a1634efdd8324c9ff11bd5121: Status 404 returned error can't find the container with id c75a051258432020e538a8b547aaf892bef9988a1634efdd8324c9ff11bd5121
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.829678 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.845215 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.859014 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.862733 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.866013 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.882982 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.926465 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78gnk\" (UniqueName: \"kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.926603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.926626 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:08 crc kubenswrapper[4722]: I0219 19:41:08.926682 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.028078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78gnk\" (UniqueName: \"kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.028256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.028317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.028400 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.028930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.033085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0"
Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.033225 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0" Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.044815 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78gnk\" (UniqueName: \"kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk\") pod \"nova-api-0\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " pod="openstack/nova-api-0" Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.082513 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a47ca62a-2546-47ec-80f7-1aa7e739e43e" path="/var/lib/kubelet/pods/a47ca62a-2546-47ec-80f7-1aa7e739e43e/volumes" Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.083182 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfd4ffd8-1f63-4881-9774-9dda64b8ae5c" path="/var/lib/kubelet/pods/dfd4ffd8-1f63-4881-9774-9dda64b8ae5c/volumes" Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.132172 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.132285 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.201719 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:41:09 crc kubenswrapper[4722]: W0219 19:41:09.701867 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96772512_a8ae_42f5_b8ce_748d1115c4ef.slice/crio-0fe3f455ace2c4dc1bef255cba16bf417ac0ad0b75a12854a770b0ccf14ad7e7 WatchSource:0}: Error finding container 0fe3f455ace2c4dc1bef255cba16bf417ac0ad0b75a12854a770b0ccf14ad7e7: Status 404 returned error can't find the container with id 0fe3f455ace2c4dc1bef255cba16bf417ac0ad0b75a12854a770b0ccf14ad7e7 Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.702471 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.730025 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7880856-0db7-4bbf-9202-04f90868fc1d","Type":"ContainerStarted","Data":"b783b9cc92c606a5cb1cba333ad7c569842c37085e96bb067ddeb43d5dffefa1"} Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.739169 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.752050 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerStarted","Data":"f9dabf9de0a02c3036feae3654812d6b4b50e934a33a2041e4126a20558ce346"} Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.752108 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerStarted","Data":"c75a051258432020e538a8b547aaf892bef9988a1634efdd8324c9ff11bd5121"} Feb 19 19:41:09 crc kubenswrapper[4722]: I0219 19:41:09.757140 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" 
podStartSLOduration=2.757118988 podStartE2EDuration="2.757118988s" podCreationTimestamp="2026-02-19 19:41:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:09.750024667 +0000 UTC m=+1369.362375011" watchObservedRunningTime="2026-02-19 19:41:09.757118988 +0000 UTC m=+1369.369469312" Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.080175 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.762908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerStarted","Data":"aad5d7ab0f17d255e8f33e2fc5558128806cd86a1f59e6e4f901792e4723d331"} Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.763300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerStarted","Data":"797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443"} Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.765943 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerStarted","Data":"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea"} Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.765994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerStarted","Data":"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68"} Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.766010 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerStarted","Data":"0fe3f455ace2c4dc1bef255cba16bf417ac0ad0b75a12854a770b0ccf14ad7e7"} Feb 19 19:41:10 crc kubenswrapper[4722]: I0219 19:41:10.781770 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.781755169 podStartE2EDuration="2.781755169s" podCreationTimestamp="2026-02-19 19:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:10.780403228 +0000 UTC m=+1370.392753552" watchObservedRunningTime="2026-02-19 19:41:10.781755169 +0000 UTC m=+1370.394105493" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.004018 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"] Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.006855 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.029352 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"] Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.124670 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.124782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rrkx\" (UniqueName: \"kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx\") pod \"redhat-operators-2vsgx\" (UID: 
\"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.124839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.226905 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rrkx\" (UniqueName: \"kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.227023 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.227205 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.227566 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " 
pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.227704 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.247309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rrkx\" (UniqueName: \"kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx\") pod \"redhat-operators-2vsgx\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.341921 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:12 crc kubenswrapper[4722]: W0219 19:41:12.922372 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a7bcc56_4611_489d_8f1b_2105503393de.slice/crio-9b0e003e8db1c8ad9f32db1f4c8f753e186cbb9352d3fba619af370f91abf338 WatchSource:0}: Error finding container 9b0e003e8db1c8ad9f32db1f4c8f753e186cbb9352d3fba619af370f91abf338: Status 404 returned error can't find the container with id 9b0e003e8db1c8ad9f32db1f4c8f753e186cbb9352d3fba619af370f91abf338 Feb 19 19:41:12 crc kubenswrapper[4722]: I0219 19:41:12.923664 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"] Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.128844 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.799652 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerStarted","Data":"9829a431d1fd723a7cc6651b150f24e011c179b882d2c99bc0bed6aa57f81823"} Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.800027 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.802861 4722 generic.go:334] "Generic (PLEG): container finished" podID="5a7bcc56-4611-489d-8f1b-2105503393de" containerID="85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772" exitCode=0 Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.802924 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerDied","Data":"85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772"} Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.802967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerStarted","Data":"9b0e003e8db1c8ad9f32db1f4c8f753e186cbb9352d3fba619af370f91abf338"} Feb 19 19:41:13 crc kubenswrapper[4722]: I0219 19:41:13.824103 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.071988225 podStartE2EDuration="6.824085277s" podCreationTimestamp="2026-02-19 19:41:07 +0000 UTC" firstStartedPulling="2026-02-19 19:41:08.794082341 +0000 UTC m=+1368.406432655" lastFinishedPulling="2026-02-19 19:41:12.546179373 +0000 UTC m=+1372.158529707" observedRunningTime="2026-02-19 19:41:13.823660374 +0000 UTC m=+1373.436010698" watchObservedRunningTime="2026-02-19 19:41:13.824085277 +0000 UTC m=+1373.436435601" Feb 19 19:41:14 crc kubenswrapper[4722]: I0219 19:41:14.131634 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 19:41:14 crc kubenswrapper[4722]: I0219 19:41:14.131698 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 19:41:14 crc kubenswrapper[4722]: I0219 19:41:14.846581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerStarted","Data":"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6"} Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.082892 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.110280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.132830 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.140322 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.140404 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:15 crc kubenswrapper[4722]: I0219 19:41:15.884659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 19:41:19 crc 
kubenswrapper[4722]: I0219 19:41:19.201885 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:41:19 crc kubenswrapper[4722]: I0219 19:41:19.202499 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 19:41:19 crc kubenswrapper[4722]: I0219 19:41:19.894074 4722 generic.go:334] "Generic (PLEG): container finished" podID="5a7bcc56-4611-489d-8f1b-2105503393de" containerID="2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6" exitCode=0 Feb 19 19:41:19 crc kubenswrapper[4722]: I0219 19:41:19.894207 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerDied","Data":"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6"} Feb 19 19:41:20 crc kubenswrapper[4722]: I0219 19:41:20.285519 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:20 crc kubenswrapper[4722]: I0219 19:41:20.285545 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.222:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:21 crc kubenswrapper[4722]: I0219 19:41:21.916074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerStarted","Data":"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0"} Feb 19 19:41:21 crc kubenswrapper[4722]: I0219 19:41:21.944067 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2vsgx" podStartSLOduration=4.403396707 podStartE2EDuration="10.944048657s" podCreationTimestamp="2026-02-19 19:41:11 +0000 UTC" firstStartedPulling="2026-02-19 19:41:13.804969909 +0000 UTC m=+1373.417320233" lastFinishedPulling="2026-02-19 19:41:20.345621819 +0000 UTC m=+1379.957972183" observedRunningTime="2026-02-19 19:41:21.937256135 +0000 UTC m=+1381.549606469" watchObservedRunningTime="2026-02-19 19:41:21.944048657 +0000 UTC m=+1381.556398981" Feb 19 19:41:22 crc kubenswrapper[4722]: I0219 19:41:22.342841 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:22 crc kubenswrapper[4722]: I0219 19:41:22.342892 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:23 crc kubenswrapper[4722]: I0219 19:41:23.393856 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2vsgx" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="registry-server" probeResult="failure" output=< Feb 19 19:41:23 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 19:41:23 crc kubenswrapper[4722]: > Feb 19 19:41:24 crc kubenswrapper[4722]: I0219 19:41:24.137266 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 19:41:24 crc kubenswrapper[4722]: I0219 19:41:24.140890 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 19:41:24 crc kubenswrapper[4722]: I0219 19:41:24.143743 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 19:41:24 crc kubenswrapper[4722]: I0219 19:41:24.958242 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Feb 19 19:41:26 crc kubenswrapper[4722]: I0219 19:41:26.977771 4722 generic.go:334] "Generic (PLEG): container finished" podID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" containerID="346ae374bf887f315658e5888cdaaef27ec7de0b0320851ac3b6d0f93d5058e0" exitCode=137 Feb 19 19:41:26 crc kubenswrapper[4722]: I0219 19:41:26.979875 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b663fb91-fb60-451c-a9c9-7278dbd1c9ac","Type":"ContainerDied","Data":"346ae374bf887f315658e5888cdaaef27ec7de0b0320851ac3b6d0f93d5058e0"} Feb 19 19:41:26 crc kubenswrapper[4722]: I0219 19:41:26.979938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b663fb91-fb60-451c-a9c9-7278dbd1c9ac","Type":"ContainerDied","Data":"f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29"} Feb 19 19:41:26 crc kubenswrapper[4722]: I0219 19:41:26.979953 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4eb8507bf9bd4ba56ef8e8c9e8c4b4aef6c178119ed37272deca3c919973d29" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.047702 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.145650 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle\") pod \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.145731 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data\") pod \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.146536 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42f98\" (UniqueName: \"kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98\") pod \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\" (UID: \"b663fb91-fb60-451c-a9c9-7278dbd1c9ac\") " Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.153689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98" (OuterVolumeSpecName: "kube-api-access-42f98") pod "b663fb91-fb60-451c-a9c9-7278dbd1c9ac" (UID: "b663fb91-fb60-451c-a9c9-7278dbd1c9ac"). InnerVolumeSpecName "kube-api-access-42f98". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.180327 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data" (OuterVolumeSpecName: "config-data") pod "b663fb91-fb60-451c-a9c9-7278dbd1c9ac" (UID: "b663fb91-fb60-451c-a9c9-7278dbd1c9ac"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.191253 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b663fb91-fb60-451c-a9c9-7278dbd1c9ac" (UID: "b663fb91-fb60-451c-a9c9-7278dbd1c9ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.248681 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.248712 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.248722 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42f98\" (UniqueName: \"kubernetes.io/projected/b663fb91-fb60-451c-a9c9-7278dbd1c9ac-kube-api-access-42f98\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:27 crc kubenswrapper[4722]: I0219 19:41:27.988526 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.024133 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.036324 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.049262 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:41:28 crc kubenswrapper[4722]: E0219 19:41:28.049946 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.049972 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.050235 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.051119 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.055994 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.056272 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.059823 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.079530 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.169281 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vln\" (UniqueName: \"kubernetes.io/projected/168eaa46-c907-452a-8537-3cea6b524360-kube-api-access-h2vln\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.169884 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.169939 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 
crc kubenswrapper[4722]: I0219 19:41:28.170173 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.170202 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.272352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.272399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.272484 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 
19:41:28.272504 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.272577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2vln\" (UniqueName: \"kubernetes.io/projected/168eaa46-c907-452a-8537-3cea6b524360-kube-api-access-h2vln\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.277674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.277690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.283669 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.284591 4722 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/168eaa46-c907-452a-8537-3cea6b524360-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.293227 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vln\" (UniqueName: \"kubernetes.io/projected/168eaa46-c907-452a-8537-3cea6b524360-kube-api-access-h2vln\") pod \"nova-cell1-novncproxy-0\" (UID: \"168eaa46-c907-452a-8537-3cea6b524360\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.380144 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:28 crc kubenswrapper[4722]: I0219 19:41:28.870009 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 19:41:28 crc kubenswrapper[4722]: W0219 19:41:28.891277 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod168eaa46_c907_452a_8537_3cea6b524360.slice/crio-87fd18953552c0fe7061a67b66886b68a74522aba8c3051a5102079a5120a20f WatchSource:0}: Error finding container 87fd18953552c0fe7061a67b66886b68a74522aba8c3051a5102079a5120a20f: Status 404 returned error can't find the container with id 87fd18953552c0fe7061a67b66886b68a74522aba8c3051a5102079a5120a20f Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.003776 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"168eaa46-c907-452a-8537-3cea6b524360","Type":"ContainerStarted","Data":"87fd18953552c0fe7061a67b66886b68a74522aba8c3051a5102079a5120a20f"} Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.091102 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b663fb91-fb60-451c-a9c9-7278dbd1c9ac" 
path="/var/lib/kubelet/pods/b663fb91-fb60-451c-a9c9-7278dbd1c9ac/volumes" Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.206809 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.207580 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.207708 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 19:41:29 crc kubenswrapper[4722]: I0219 19:41:29.213906 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.016098 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"168eaa46-c907-452a-8537-3cea6b524360","Type":"ContainerStarted","Data":"9340639e2e9665570ab0da301dc3a40575632326238c2a65a98c80cedd64dc02"} Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.017459 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.022315 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.063588 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.063565383 podStartE2EDuration="2.063565383s" podCreationTimestamp="2026-02-19 19:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:30.036540199 +0000 UTC m=+1389.648890523" watchObservedRunningTime="2026-02-19 19:41:30.063565383 +0000 UTC m=+1389.675915727" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.271780 4722 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"] Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.279196 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.290070 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"] Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.321617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.321842 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.322268 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.322371 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lfrj\" (UniqueName: \"kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: 
\"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.322540 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.322784 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.427910 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.427974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.428062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" 
(UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.428108 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.428143 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.428184 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lfrj\" (UniqueName: \"kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.429480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.429528 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 
19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.429816 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.429953 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.429953 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.454478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lfrj\" (UniqueName: \"kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj\") pod \"dnsmasq-dns-5fd9b586ff-f59d8\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:30 crc kubenswrapper[4722]: I0219 19:41:30.607779 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:31 crc kubenswrapper[4722]: I0219 19:41:31.152226 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"] Feb 19 19:41:31 crc kubenswrapper[4722]: W0219 19:41:31.166318 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfcca6fc_5afb_464c_9852_3532ba5878a3.slice/crio-a7846c5aa72760b5fbf2419a5198a4a23f44068dc0e3a98cd281007d3f37f7b4 WatchSource:0}: Error finding container a7846c5aa72760b5fbf2419a5198a4a23f44068dc0e3a98cd281007d3f37f7b4: Status 404 returned error can't find the container with id a7846c5aa72760b5fbf2419a5198a4a23f44068dc0e3a98cd281007d3f37f7b4 Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.049263 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerID="2645caf8bc3502647b4c5a4dc4d97510df5ceb77697881dbc41661d5cae80579" exitCode=0 Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.049304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" event={"ID":"dfcca6fc-5afb-464c-9852-3532ba5878a3","Type":"ContainerDied","Data":"2645caf8bc3502647b4c5a4dc4d97510df5ceb77697881dbc41661d5cae80579"} Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.049715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" event={"ID":"dfcca6fc-5afb-464c-9852-3532ba5878a3","Type":"ContainerStarted","Data":"a7846c5aa72760b5fbf2419a5198a4a23f44068dc0e3a98cd281007d3f37f7b4"} Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.395101 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.463953 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.644531 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"] Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.699858 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.700251 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-central-agent" containerID="cri-o://f9dabf9de0a02c3036feae3654812d6b4b50e934a33a2041e4126a20558ce346" gracePeriod=30 Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.700478 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-notification-agent" containerID="cri-o://797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443" gracePeriod=30 Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.700671 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="proxy-httpd" containerID="cri-o://9829a431d1fd723a7cc6651b150f24e011c179b882d2c99bc0bed6aa57f81823" gracePeriod=30 Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.700830 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="sg-core" containerID="cri-o://aad5d7ab0f17d255e8f33e2fc5558128806cd86a1f59e6e4f901792e4723d331" gracePeriod=30 Feb 19 19:41:32 crc kubenswrapper[4722]: I0219 19:41:32.725192 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="proxy-httpd" 
probeResult="failure" output="Get \"https://10.217.0.221:3000/\": EOF" Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.050196 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062188 4722 generic.go:334] "Generic (PLEG): container finished" podID="35309d31-c095-492f-8645-f99a629dafd5" containerID="9829a431d1fd723a7cc6651b150f24e011c179b882d2c99bc0bed6aa57f81823" exitCode=0 Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062223 4722 generic.go:334] "Generic (PLEG): container finished" podID="35309d31-c095-492f-8645-f99a629dafd5" containerID="aad5d7ab0f17d255e8f33e2fc5558128806cd86a1f59e6e4f901792e4723d331" exitCode=2 Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062233 4722 generic.go:334] "Generic (PLEG): container finished" podID="35309d31-c095-492f-8645-f99a629dafd5" containerID="f9dabf9de0a02c3036feae3654812d6b4b50e934a33a2041e4126a20558ce346" exitCode=0 Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062279 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerDied","Data":"9829a431d1fd723a7cc6651b150f24e011c179b882d2c99bc0bed6aa57f81823"} Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerDied","Data":"aad5d7ab0f17d255e8f33e2fc5558128806cd86a1f59e6e4f901792e4723d331"} Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.062325 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerDied","Data":"f9dabf9de0a02c3036feae3654812d6b4b50e934a33a2041e4126a20558ce346"} Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.064538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" event={"ID":"dfcca6fc-5afb-464c-9852-3532ba5878a3","Type":"ContainerStarted","Data":"07ae856c61611ad79b54a655cdc3c7aa79d812aa79705666cb7de6834474fefb"} Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.064734 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-log" containerID="cri-o://3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68" gracePeriod=30 Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.064770 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-api" containerID="cri-o://8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea" gracePeriod=30 Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.103118 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" podStartSLOduration=3.103101773 podStartE2EDuration="3.103101773s" podCreationTimestamp="2026-02-19 19:41:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:33.100220863 +0000 UTC m=+1392.712571197" watchObservedRunningTime="2026-02-19 19:41:33.103101773 +0000 UTC m=+1392.715452097" Feb 19 19:41:33 crc kubenswrapper[4722]: I0219 19:41:33.380287 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.079852 4722 generic.go:334] "Generic (PLEG): container finished" podID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerID="3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68" exitCode=143 Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.079939 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerDied","Data":"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68"} Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.080269 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2vsgx" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="registry-server" containerID="cri-o://99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0" gracePeriod=2 Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.080511 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.630170 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.738177 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities\") pod \"5a7bcc56-4611-489d-8f1b-2105503393de\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.738323 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rrkx\" (UniqueName: \"kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx\") pod \"5a7bcc56-4611-489d-8f1b-2105503393de\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.738385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content\") pod \"5a7bcc56-4611-489d-8f1b-2105503393de\" (UID: \"5a7bcc56-4611-489d-8f1b-2105503393de\") " Feb 19 
19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.744234 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx" (OuterVolumeSpecName: "kube-api-access-8rrkx") pod "5a7bcc56-4611-489d-8f1b-2105503393de" (UID: "5a7bcc56-4611-489d-8f1b-2105503393de"). InnerVolumeSpecName "kube-api-access-8rrkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.758639 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities" (OuterVolumeSpecName: "utilities") pod "5a7bcc56-4611-489d-8f1b-2105503393de" (UID: "5a7bcc56-4611-489d-8f1b-2105503393de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.840279 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.840311 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rrkx\" (UniqueName: \"kubernetes.io/projected/5a7bcc56-4611-489d-8f1b-2105503393de-kube-api-access-8rrkx\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.880432 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a7bcc56-4611-489d-8f1b-2105503393de" (UID: "5a7bcc56-4611-489d-8f1b-2105503393de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:34 crc kubenswrapper[4722]: I0219 19:41:34.942723 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a7bcc56-4611-489d-8f1b-2105503393de-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.094965 4722 generic.go:334] "Generic (PLEG): container finished" podID="5a7bcc56-4611-489d-8f1b-2105503393de" containerID="99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0" exitCode=0 Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.095081 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2vsgx" Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.095098 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerDied","Data":"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0"} Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.096198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2vsgx" event={"ID":"5a7bcc56-4611-489d-8f1b-2105503393de","Type":"ContainerDied","Data":"9b0e003e8db1c8ad9f32db1f4c8f753e186cbb9352d3fba619af370f91abf338"} Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.096243 4722 scope.go:117] "RemoveContainer" containerID="99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0" Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.132747 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"] Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.138368 4722 scope.go:117] "RemoveContainer" containerID="2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6" Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 
19:41:35.150406 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2vsgx"] Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.173277 4722 scope.go:117] "RemoveContainer" containerID="85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772" Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.230877 4722 scope.go:117] "RemoveContainer" containerID="99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0" Feb 19 19:41:35 crc kubenswrapper[4722]: E0219 19:41:35.236561 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0\": container with ID starting with 99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0 not found: ID does not exist" containerID="99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0" Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.236662 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0"} err="failed to get container status \"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0\": rpc error: code = NotFound desc = could not find container \"99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0\": container with ID starting with 99451d8104692a872658ddc3eb551a93e6076b714f2196db292a0b0d28dd0db0 not found: ID does not exist" Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.236762 4722 scope.go:117] "RemoveContainer" containerID="2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6" Feb 19 19:41:35 crc kubenswrapper[4722]: E0219 19:41:35.239561 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6\": container with ID 
starting with 2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6 not found: ID does not exist" containerID="2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6" Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.239587 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6"} err="failed to get container status \"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6\": rpc error: code = NotFound desc = could not find container \"2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6\": container with ID starting with 2eadc2489ba077f2e9fe1f7e2b89d49ae2f07b859b940d57dfd3765c2accf1b6 not found: ID does not exist" Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.239606 4722 scope.go:117] "RemoveContainer" containerID="85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772" Feb 19 19:41:35 crc kubenswrapper[4722]: E0219 19:41:35.242433 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772\": container with ID starting with 85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772 not found: ID does not exist" containerID="85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772" Feb 19 19:41:35 crc kubenswrapper[4722]: I0219 19:41:35.242458 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772"} err="failed to get container status \"85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772\": rpc error: code = NotFound desc = could not find container \"85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772\": container with ID starting with 85709083e162ef7ae795ed74653a635dfb9cce4733622273469f417b2e570772 not found: 
ID does not exist" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.750614 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.883830 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78gnk\" (UniqueName: \"kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk\") pod \"96772512-a8ae-42f5-b8ce-748d1115c4ef\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.883925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle\") pod \"96772512-a8ae-42f5-b8ce-748d1115c4ef\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.884007 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data\") pod \"96772512-a8ae-42f5-b8ce-748d1115c4ef\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.884071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs\") pod \"96772512-a8ae-42f5-b8ce-748d1115c4ef\" (UID: \"96772512-a8ae-42f5-b8ce-748d1115c4ef\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.884692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs" (OuterVolumeSpecName: "logs") pod "96772512-a8ae-42f5-b8ce-748d1115c4ef" (UID: "96772512-a8ae-42f5-b8ce-748d1115c4ef"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.893552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk" (OuterVolumeSpecName: "kube-api-access-78gnk") pod "96772512-a8ae-42f5-b8ce-748d1115c4ef" (UID: "96772512-a8ae-42f5-b8ce-748d1115c4ef"). InnerVolumeSpecName "kube-api-access-78gnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.952441 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96772512-a8ae-42f5-b8ce-748d1115c4ef" (UID: "96772512-a8ae-42f5-b8ce-748d1115c4ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.965700 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data" (OuterVolumeSpecName: "config-data") pod "96772512-a8ae-42f5-b8ce-748d1115c4ef" (UID: "96772512-a8ae-42f5-b8ce-748d1115c4ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.988840 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78gnk\" (UniqueName: \"kubernetes.io/projected/96772512-a8ae-42f5-b8ce-748d1115c4ef-kube-api-access-78gnk\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.988867 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.988878 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96772512-a8ae-42f5-b8ce-748d1115c4ef-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:36.988888 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96772512-a8ae-42f5-b8ce-748d1115c4ef-logs\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.083189 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35309d31_c095_492f_8645_f99a629dafd5.slice/crio-797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443.scope\": RecentStats: unable to find data in memory cache]" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.138733 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" path="/var/lib/kubelet/pods/5a7bcc56-4611-489d-8f1b-2105503393de/volumes" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.240425 4722 generic.go:334] "Generic (PLEG): container finished" podID="35309d31-c095-492f-8645-f99a629dafd5" 
containerID="797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443" exitCode=0 Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.240521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerDied","Data":"797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443"} Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.272493 4722 generic.go:334] "Generic (PLEG): container finished" podID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerID="8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea" exitCode=0 Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.272547 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerDied","Data":"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea"} Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.272580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"96772512-a8ae-42f5-b8ce-748d1115c4ef","Type":"ContainerDied","Data":"0fe3f455ace2c4dc1bef255cba16bf417ac0ad0b75a12854a770b0ccf14ad7e7"} Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.272601 4722 scope.go:117] "RemoveContainer" containerID="8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.272834 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.366749 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.393235 4722 scope.go:117] "RemoveContainer" containerID="3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.397411 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.417696 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.418255 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-log" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.418276 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-log" Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.418301 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="registry-server" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.418310 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="registry-server" Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.431572 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="extract-utilities" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.431618 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="extract-utilities" Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.431662 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-api" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.431671 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-api" Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.431684 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="extract-content" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.431690 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="extract-content" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.432064 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-api" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.432085 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a7bcc56-4611-489d-8f1b-2105503393de" containerName="registry-server" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.432106 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" containerName="nova-api-log" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.436327 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.436409 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.443047 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.443417 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.443616 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.443805 4722 scope.go:117] "RemoveContainer" containerID="8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea" Feb 19 19:41:37 crc kubenswrapper[4722]: E0219 19:41:37.444546 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea\": container with ID starting with 8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea not found: ID does not exist" containerID="8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.444586 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea"} err="failed to get container status \"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea\": rpc error: code = NotFound desc = could not find container \"8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea\": container with ID starting with 8a10d95d1d32406ceefe6ec1feb12fa0927bbc00500e88f49403340b270a16ea not found: ID does not exist" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.444618 4722 scope.go:117] "RemoveContainer" containerID="3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68" Feb 19 19:41:37 crc 
kubenswrapper[4722]: E0219 19:41:37.446356 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68\": container with ID starting with 3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68 not found: ID does not exist" containerID="3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.446382 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68"} err="failed to get container status \"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68\": rpc error: code = NotFound desc = could not find container \"3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68\": container with ID starting with 3a6316d0c173bf8e09715c1e68aa618b966abe2e522b55af7e34d1f85003bf68 not found: ID does not exist" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.500996 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5gc\" (UniqueName: \"kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.501058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.501092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.501139 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.501257 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.501313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.602852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.603145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc 
kubenswrapper[4722]: I0219 19:41:37.604542 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5gc\" (UniqueName: \"kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.605286 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.605351 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.605419 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.605969 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.613882 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.614796 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.617284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.617421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.621723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5gc\" (UniqueName: \"kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc\") pod \"nova-api-0\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") " pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.701743 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.754867 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.809641 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.809946 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810102 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810257 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810402 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfrfq\" (UniqueName: \"kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810490 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810543 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.810658 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd\") pod \"35309d31-c095-492f-8645-f99a629dafd5\" (UID: \"35309d31-c095-492f-8645-f99a629dafd5\") " Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.811447 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.811571 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.814321 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts" (OuterVolumeSpecName: "scripts") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.815394 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq" (OuterVolumeSpecName: "kube-api-access-vfrfq") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "kube-api-access-vfrfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.845452 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.887625 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.930100 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.930128 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfrfq\" (UniqueName: \"kubernetes.io/projected/35309d31-c095-492f-8645-f99a629dafd5-kube-api-access-vfrfq\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.930232 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.930269 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.930279 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35309d31-c095-492f-8645-f99a629dafd5-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.937374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:37 crc kubenswrapper[4722]: I0219 19:41:37.958419 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data" (OuterVolumeSpecName: "config-data") pod "35309d31-c095-492f-8645-f99a629dafd5" (UID: "35309d31-c095-492f-8645-f99a629dafd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.032243 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.032277 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35309d31-c095-492f-8645-f99a629dafd5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.287808 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35309d31-c095-492f-8645-f99a629dafd5","Type":"ContainerDied","Data":"c75a051258432020e538a8b547aaf892bef9988a1634efdd8324c9ff11bd5121"} Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.287871 4722 scope.go:117] "RemoveContainer" containerID="9829a431d1fd723a7cc6651b150f24e011c179b882d2c99bc0bed6aa57f81823" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.287921 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.311113 4722 scope.go:117] "RemoveContainer" containerID="aad5d7ab0f17d255e8f33e2fc5558128806cd86a1f59e6e4f901792e4723d331" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.330524 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.380918 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.406200 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.470532 4722 scope.go:117] "RemoveContainer" containerID="797f3dbcc096d3c7dab6d3e2bf6dbfd40f7157e1f303fed3bdb44a3590ac7443" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.523937 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.546290 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.556825 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:41:38 crc kubenswrapper[4722]: E0219 19:41:38.557348 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="sg-core" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557369 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="sg-core" Feb 19 19:41:38 crc kubenswrapper[4722]: E0219 19:41:38.557389 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="proxy-httpd" Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557397 
4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="proxy-httpd"
Feb 19 19:41:38 crc kubenswrapper[4722]: E0219 19:41:38.557411 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-notification-agent"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557421 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-notification-agent"
Feb 19 19:41:38 crc kubenswrapper[4722]: E0219 19:41:38.557445 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-central-agent"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557454 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-central-agent"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557697 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="proxy-httpd"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557713 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="sg-core"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557741 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-central-agent"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.557760 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35309d31-c095-492f-8645-f99a629dafd5" containerName="ceilometer-notification-agent"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.560942 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.563466 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.566593 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.566843 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.570762 4722 scope.go:117] "RemoveContainer" containerID="f9dabf9de0a02c3036feae3654812d6b4b50e934a33a2041e4126a20558ce346"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.571909 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645676 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645705 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645810 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645907 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz5fq\" (UniqueName: \"kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.645945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.748930 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz5fq\" (UniqueName: \"kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.748978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749232 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749290 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749512 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.749546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.750114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.750353 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.754332 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.771144 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.771411 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.771566 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.775008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.779738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz5fq\" (UniqueName: \"kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq\") pod \"ceilometer-0\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " pod="openstack/ceilometer-0"
Feb 19 19:41:38 crc kubenswrapper[4722]: I0219 19:41:38.885938 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.081170 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35309d31-c095-492f-8645-f99a629dafd5" path="/var/lib/kubelet/pods/35309d31-c095-492f-8645-f99a629dafd5/volumes"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.082242 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96772512-a8ae-42f5-b8ce-748d1115c4ef" path="/var/lib/kubelet/pods/96772512-a8ae-42f5-b8ce-748d1115c4ef/volumes"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.300243 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerStarted","Data":"8b7451defcdb3c6dd2e97b1753af390c2ae8fee12578c7f793efbbbccdf699e8"}
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.300521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerStarted","Data":"7dfeaedcfdea401372ee480bf2951f7e18eed6383fb5d16dc006019bff38c3a8"}
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.300533 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerStarted","Data":"169addbbb50b70578d635a167ff6f9ab09ad19713f22931dee2d5c96156a2632"}
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.322295 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.324212 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.32419915 podStartE2EDuration="2.32419915s" podCreationTimestamp="2026-02-19 19:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:39.322544449 +0000 UTC m=+1398.934894773" watchObservedRunningTime="2026-02-19 19:41:39.32419915 +0000 UTC m=+1398.936549494"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.440268 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 19:41:39 crc kubenswrapper[4722]: W0219 19:41:39.441844 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe6ac8e6_55da_4c9f_b3b3_fc60afc50c37.slice/crio-70a6581ff6c48dbe85b738501a440479409b7191194cf27a145b098c06b0a532 WatchSource:0}: Error finding container 70a6581ff6c48dbe85b738501a440479409b7191194cf27a145b098c06b0a532: Status 404 returned error can't find the container with id 70a6581ff6c48dbe85b738501a440479409b7191194cf27a145b098c06b0a532
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.489611 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-2lhsl"]
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.492246 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.496020 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.496965 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.504605 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2lhsl"]
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.565367 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.565407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.565587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.565851 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cqv2\" (UniqueName: \"kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.668361 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.668410 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.668491 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.668592 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cqv2\" (UniqueName: \"kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.675829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.676084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.676333 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.684891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cqv2\" (UniqueName: \"kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2\") pod \"nova-cell1-cell-mapping-2lhsl\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") " pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:39 crc kubenswrapper[4722]: I0219 19:41:39.816862 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.290241 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2lhsl"]
Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.328354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2lhsl" event={"ID":"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8","Type":"ContainerStarted","Data":"a029982f23c93dbc7ae7b59f97f09872491299ab2ae9f8e1237d81964fadfc32"}
Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.387473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerStarted","Data":"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679"}
Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.387518 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerStarted","Data":"70a6581ff6c48dbe85b738501a440479409b7191194cf27a145b098c06b0a532"}
Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.610339 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8"
Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.688953 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"]
Feb 19 19:41:40 crc kubenswrapper[4722]: I0219 19:41:40.689890 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="dnsmasq-dns" containerID="cri-o://42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e" gracePeriod=10
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.276319 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409498 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") "
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409564 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dvwm\" (UniqueName: \"kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") "
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409594 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") "
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409677 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") "
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409765 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") "
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.409802 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc\") pod \"5e629ce1-0108-4450-bb62-44ca1d2993b6\" (UID: \"5e629ce1-0108-4450-bb62-44ca1d2993b6\") "
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.416612 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerStarted","Data":"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d"}
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.418127 4722 generic.go:334] "Generic (PLEG): container finished" podID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerID="42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e" exitCode=0
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.418195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" event={"ID":"5e629ce1-0108-4450-bb62-44ca1d2993b6","Type":"ContainerDied","Data":"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e"}
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.418212 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-5cmk8" event={"ID":"5e629ce1-0108-4450-bb62-44ca1d2993b6","Type":"ContainerDied","Data":"665dc9cdb3191f51c3a767aeef3d279162c1c7c6e48e1f117eb053fc7bdf1b06"}
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.418228 4722 scope.go:117] "RemoveContainer" containerID="42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e"
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.418361 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-5cmk8"
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.423345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm" (OuterVolumeSpecName: "kube-api-access-5dvwm") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "kube-api-access-5dvwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.423680 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2lhsl" event={"ID":"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8","Type":"ContainerStarted","Data":"1b22b481f4ab7fb0f4e181aec8382fbfd29168cb889f08d4a7d81841adae3d63"}
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.446657 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2lhsl" podStartSLOduration=2.446639669 podStartE2EDuration="2.446639669s" podCreationTimestamp="2026-02-19 19:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:41.441658603 +0000 UTC m=+1401.054008937" watchObservedRunningTime="2026-02-19 19:41:41.446639669 +0000 UTC m=+1401.058989993"
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.519427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.520904 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config" (OuterVolumeSpecName: "config") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.527575 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dvwm\" (UniqueName: \"kubernetes.io/projected/5e629ce1-0108-4450-bb62-44ca1d2993b6-kube-api-access-5dvwm\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.542950 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.544708 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.582613 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e629ce1-0108-4450-bb62-44ca1d2993b6" (UID: "5e629ce1-0108-4450-bb62-44ca1d2993b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.630008 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.630050 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.630062 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.630075 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-config\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.630086 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e629ce1-0108-4450-bb62-44ca1d2993b6-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.632626 4722 scope.go:117] "RemoveContainer" containerID="3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b"
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.669216 4722 scope.go:117] "RemoveContainer" containerID="42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e"
Feb 19 19:41:41 crc kubenswrapper[4722]: E0219 19:41:41.670024 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e\": container with ID starting with 42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e not found: ID does not exist" containerID="42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e"
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.670078 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e"} err="failed to get container status \"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e\": rpc error: code = NotFound desc = could not find container \"42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e\": container with ID starting with 42c7d28a4a823825334e8609cbe3ef950913993a5d1b9e06632fe4134cf9488e not found: ID does not exist"
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.670105 4722 scope.go:117] "RemoveContainer" containerID="3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b"
Feb 19 19:41:41 crc kubenswrapper[4722]: E0219 19:41:41.670808 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b\": container with ID starting with 3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b not found: ID does not exist" containerID="3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b"
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.670867 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b"} err="failed to get container status \"3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b\": rpc error: code = NotFound desc = could not find container \"3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b\": container with ID starting with 3881b8420c7618fea838fa8a6e33b4134169541a7d8bf1d0784296cdaac71b4b not found: ID does not exist"
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.767416 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"]
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.786430 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-5cmk8"]
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.798435 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:41:41 crc kubenswrapper[4722]: I0219 19:41:41.798486 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:41:42 crc kubenswrapper[4722]: I0219 19:41:42.437790 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerStarted","Data":"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f"}
Feb 19 19:41:43 crc kubenswrapper[4722]: I0219 19:41:43.087107 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" path="/var/lib/kubelet/pods/5e629ce1-0108-4450-bb62-44ca1d2993b6/volumes"
Feb 19 19:41:44 crc kubenswrapper[4722]: I0219 19:41:44.458767 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerStarted","Data":"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4"}
Feb 19 19:41:44 crc kubenswrapper[4722]: I0219 19:41:44.459508 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 19:41:44 crc kubenswrapper[4722]: I0219 19:41:44.493280 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8494755619999999 podStartE2EDuration="6.49325827s" podCreationTimestamp="2026-02-19 19:41:38 +0000 UTC" firstStartedPulling="2026-02-19 19:41:39.443989113 +0000 UTC m=+1399.056339447" lastFinishedPulling="2026-02-19 19:41:44.087771831 +0000 UTC m=+1403.700122155" observedRunningTime="2026-02-19 19:41:44.480348426 +0000 UTC m=+1404.092698750" watchObservedRunningTime="2026-02-19 19:41:44.49325827 +0000 UTC m=+1404.105608594"
Feb 19 19:41:46 crc kubenswrapper[4722]: I0219 19:41:46.481189 4722 generic.go:334] "Generic (PLEG): container finished" podID="ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" containerID="1b22b481f4ab7fb0f4e181aec8382fbfd29168cb889f08d4a7d81841adae3d63" exitCode=0
Feb 19 19:41:46 crc kubenswrapper[4722]: I0219 19:41:46.481384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2lhsl" event={"ID":"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8","Type":"ContainerDied","Data":"1b22b481f4ab7fb0f4e181aec8382fbfd29168cb889f08d4a7d81841adae3d63"}
Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.755415 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.755928 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.917786 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2lhsl"
Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.964553 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cqv2\" (UniqueName: \"kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2\") pod \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") "
Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.964625 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts\") pod \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") "
Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.964822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle\") pod \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") "
Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.964881 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data\") pod \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\" (UID: \"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8\") "
Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.971116 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2" (OuterVolumeSpecName: "kube-api-access-4cqv2") pod "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" (UID: "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8"). InnerVolumeSpecName "kube-api-access-4cqv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.974261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts" (OuterVolumeSpecName: "scripts") pod "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" (UID: "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:47 crc kubenswrapper[4722]: I0219 19:41:47.998433 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data" (OuterVolumeSpecName: "config-data") pod "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" (UID: "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.023317 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" (UID: "ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.067618 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.067653 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cqv2\" (UniqueName: \"kubernetes.io/projected/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-kube-api-access-4cqv2\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.067665 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.067673 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.544677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2lhsl" event={"ID":"ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8","Type":"ContainerDied","Data":"a029982f23c93dbc7ae7b59f97f09872491299ab2ae9f8e1237d81964fadfc32"} Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.544935 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a029982f23c93dbc7ae7b59f97f09872491299ab2ae9f8e1237d81964fadfc32" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.544983 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2lhsl" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.693299 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.693600 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerName="nova-scheduler-scheduler" containerID="cri-o://fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" gracePeriod=30 Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.726352 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.726597 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-log" containerID="cri-o://7dfeaedcfdea401372ee480bf2951f7e18eed6383fb5d16dc006019bff38c3a8" gracePeriod=30 Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.726670 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-api" containerID="cri-o://8b7451defcdb3c6dd2e97b1753af390c2ae8fee12578c7f793efbbbccdf699e8" gracePeriod=30 Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.754308 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": EOF" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.758307 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.226:8774/\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.770093 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.770406 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log" containerID="cri-o://9e174e4ac1407291135c0ab1018e954feca197504199df9de4977de4b585f9e2" gracePeriod=30 Feb 19 19:41:48 crc kubenswrapper[4722]: I0219 19:41:48.770754 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata" containerID="cri-o://a02b0085e2ba8a5b6e93ff14529363efb51fc8a03bd2360fc27b7d63f1740346" gracePeriod=30 Feb 19 19:41:49 crc kubenswrapper[4722]: I0219 19:41:49.555295 4722 generic.go:334] "Generic (PLEG): container finished" podID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerID="7dfeaedcfdea401372ee480bf2951f7e18eed6383fb5d16dc006019bff38c3a8" exitCode=143 Feb 19 19:41:49 crc kubenswrapper[4722]: I0219 19:41:49.555368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerDied","Data":"7dfeaedcfdea401372ee480bf2951f7e18eed6383fb5d16dc006019bff38c3a8"} Feb 19 19:41:49 crc kubenswrapper[4722]: I0219 19:41:49.557692 4722 generic.go:334] "Generic (PLEG): container finished" podID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerID="9e174e4ac1407291135c0ab1018e954feca197504199df9de4977de4b585f9e2" exitCode=143 Feb 19 19:41:49 crc kubenswrapper[4722]: I0219 19:41:49.557736 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerDied","Data":"9e174e4ac1407291135c0ab1018e954feca197504199df9de4977de4b585f9e2"} Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.080717 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 is running failed: container process not found" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.081123 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 is running failed: container process not found" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.081326 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 is running failed: container process not found" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.081352 4722 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerName="nova-scheduler-scheduler" Feb 19 19:41:50 
crc kubenswrapper[4722]: I0219 19:41:50.210919 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.314962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjj24\" (UniqueName: \"kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24\") pod \"339423c2-068b-48f8-8117-04f6a37ceaf9\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.315030 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle\") pod \"339423c2-068b-48f8-8117-04f6a37ceaf9\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.315100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") pod \"339423c2-068b-48f8-8117-04f6a37ceaf9\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.321103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24" (OuterVolumeSpecName: "kube-api-access-qjj24") pod "339423c2-068b-48f8-8117-04f6a37ceaf9" (UID: "339423c2-068b-48f8-8117-04f6a37ceaf9"). InnerVolumeSpecName "kube-api-access-qjj24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.360615 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data podName:339423c2-068b-48f8-8117-04f6a37ceaf9 nodeName:}" failed. 
No retries permitted until 2026-02-19 19:41:50.860584784 +0000 UTC m=+1410.472935098 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data") pod "339423c2-068b-48f8-8117-04f6a37ceaf9" (UID: "339423c2-068b-48f8-8117-04f6a37ceaf9") : error deleting /var/lib/kubelet/pods/339423c2-068b-48f8-8117-04f6a37ceaf9/volume-subpaths: remove /var/lib/kubelet/pods/339423c2-068b-48f8-8117-04f6a37ceaf9/volume-subpaths: no such file or directory Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.365322 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "339423c2-068b-48f8-8117-04f6a37ceaf9" (UID: "339423c2-068b-48f8-8117-04f6a37ceaf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.417208 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjj24\" (UniqueName: \"kubernetes.io/projected/339423c2-068b-48f8-8117-04f6a37ceaf9-kube-api-access-qjj24\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.417242 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.567543 4722 generic.go:334] "Generic (PLEG): container finished" podID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" exitCode=0 Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.567583 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"339423c2-068b-48f8-8117-04f6a37ceaf9","Type":"ContainerDied","Data":"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40"} Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.567608 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"339423c2-068b-48f8-8117-04f6a37ceaf9","Type":"ContainerDied","Data":"b861b10dd2203a847e9aec1463641c9c77079ce2640617c70a1b02fbcb8c691f"} Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.567614 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.567625 4722 scope.go:117] "RemoveContainer" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.588798 4722 scope.go:117] "RemoveContainer" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" Feb 19 19:41:50 crc kubenswrapper[4722]: E0219 19:41:50.589312 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40\": container with ID starting with fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 not found: ID does not exist" containerID="fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.589367 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40"} err="failed to get container status \"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40\": rpc error: code = NotFound desc = could not find container \"fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40\": container with ID starting with 
fe9056b99aa2aa1ebf11406c4f5712d0080dabde56a5cf3ed9c1e81190c4ef40 not found: ID does not exist" Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.929021 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") pod \"339423c2-068b-48f8-8117-04f6a37ceaf9\" (UID: \"339423c2-068b-48f8-8117-04f6a37ceaf9\") " Feb 19 19:41:50 crc kubenswrapper[4722]: I0219 19:41:50.933345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data" (OuterVolumeSpecName: "config-data") pod "339423c2-068b-48f8-8117-04f6a37ceaf9" (UID: "339423c2-068b-48f8-8117-04f6a37ceaf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.031595 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339423c2-068b-48f8-8117-04f6a37ceaf9-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.200884 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.223474 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.242303 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:51 crc kubenswrapper[4722]: E0219 19:41:51.242788 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="dnsmasq-dns" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.242808 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="dnsmasq-dns" Feb 19 19:41:51 crc 
kubenswrapper[4722]: E0219 19:41:51.242829 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="init" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.242837 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="init" Feb 19 19:41:51 crc kubenswrapper[4722]: E0219 19:41:51.242850 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerName="nova-scheduler-scheduler" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.242858 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerName="nova-scheduler-scheduler" Feb 19 19:41:51 crc kubenswrapper[4722]: E0219 19:41:51.242876 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" containerName="nova-manage" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.242886 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" containerName="nova-manage" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.243112 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e629ce1-0108-4450-bb62-44ca1d2993b6" containerName="dnsmasq-dns" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.243124 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" containerName="nova-scheduler-scheduler" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.243132 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" containerName="nova-manage" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.243959 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.250378 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.259291 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.352135 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-config-data\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.352511 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.352671 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcdrh\" (UniqueName: \"kubernetes.io/projected/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-kube-api-access-bcdrh\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.454797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-config-data\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.455269 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.455830 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcdrh\" (UniqueName: \"kubernetes.io/projected/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-kube-api-access-bcdrh\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.460081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.462907 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-config-data\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.476332 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcdrh\" (UniqueName: \"kubernetes.io/projected/6f1f5c9a-dacb-45b5-95bf-2e62a12a908b-kube-api-access-bcdrh\") pod \"nova-scheduler-0\" (UID: \"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b\") " pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.566827 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.912180 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": read tcp 10.217.0.2:33674->10.217.0.217:8775: read: connection reset by peer" Feb 19 19:41:51 crc kubenswrapper[4722]: I0219 19:41:51.912497 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": read tcp 10.217.0.2:33684->10.217.0.217:8775: read: connection reset by peer" Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.006328 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 19:41:52 crc kubenswrapper[4722]: W0219 19:41:52.011497 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f1f5c9a_dacb_45b5_95bf_2e62a12a908b.slice/crio-5aa2158364b2aa5e848b3909be7fd3cc97183bc657d0a9b78e54c0921d39e6c7 WatchSource:0}: Error finding container 5aa2158364b2aa5e848b3909be7fd3cc97183bc657d0a9b78e54c0921d39e6c7: Status 404 returned error can't find the container with id 5aa2158364b2aa5e848b3909be7fd3cc97183bc657d0a9b78e54c0921d39e6c7 Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.594897 4722 generic.go:334] "Generic (PLEG): container finished" podID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerID="a02b0085e2ba8a5b6e93ff14529363efb51fc8a03bd2360fc27b7d63f1740346" exitCode=0 Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.594998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerDied","Data":"a02b0085e2ba8a5b6e93ff14529363efb51fc8a03bd2360fc27b7d63f1740346"} Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.597664 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b","Type":"ContainerStarted","Data":"634cc860abf2b738b05ad29a3d1f796b3e6b10bd722e31641e2a789979bc2e2e"} Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.597710 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6f1f5c9a-dacb-45b5-95bf-2e62a12a908b","Type":"ContainerStarted","Data":"5aa2158364b2aa5e848b3909be7fd3cc97183bc657d0a9b78e54c0921d39e6c7"} Feb 19 19:41:52 crc kubenswrapper[4722]: I0219 19:41:52.618722 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.618696181 podStartE2EDuration="1.618696181s" podCreationTimestamp="2026-02-19 19:41:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:52.613886941 +0000 UTC m=+1412.226237275" watchObservedRunningTime="2026-02-19 19:41:52.618696181 +0000 UTC m=+1412.231046505" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.088987 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="339423c2-068b-48f8-8117-04f6a37ceaf9" path="/var/lib/kubelet/pods/339423c2-068b-48f8-8117-04f6a37ceaf9/volumes" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.127473 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.290170 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs\") pod \"3f9140da-76d7-4109-9892-23c1ceb60eaa\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.290347 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data\") pod \"3f9140da-76d7-4109-9892-23c1ceb60eaa\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.290403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9cmx\" (UniqueName: \"kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx\") pod \"3f9140da-76d7-4109-9892-23c1ceb60eaa\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.290425 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle\") pod \"3f9140da-76d7-4109-9892-23c1ceb60eaa\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.290528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs\") pod \"3f9140da-76d7-4109-9892-23c1ceb60eaa\" (UID: \"3f9140da-76d7-4109-9892-23c1ceb60eaa\") " Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.291770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs" (OuterVolumeSpecName: "logs") pod "3f9140da-76d7-4109-9892-23c1ceb60eaa" (UID: "3f9140da-76d7-4109-9892-23c1ceb60eaa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.296060 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx" (OuterVolumeSpecName: "kube-api-access-z9cmx") pod "3f9140da-76d7-4109-9892-23c1ceb60eaa" (UID: "3f9140da-76d7-4109-9892-23c1ceb60eaa"). InnerVolumeSpecName "kube-api-access-z9cmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.334308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f9140da-76d7-4109-9892-23c1ceb60eaa" (UID: "3f9140da-76d7-4109-9892-23c1ceb60eaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.339297 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data" (OuterVolumeSpecName: "config-data") pod "3f9140da-76d7-4109-9892-23c1ceb60eaa" (UID: "3f9140da-76d7-4109-9892-23c1ceb60eaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.362408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3f9140da-76d7-4109-9892-23c1ceb60eaa" (UID: "3f9140da-76d7-4109-9892-23c1ceb60eaa"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.394383 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9140da-76d7-4109-9892-23c1ceb60eaa-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.394426 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.394443 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.394456 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9cmx\" (UniqueName: \"kubernetes.io/projected/3f9140da-76d7-4109-9892-23c1ceb60eaa-kube-api-access-z9cmx\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.394469 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9140da-76d7-4109-9892-23c1ceb60eaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.628823 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.634493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9140da-76d7-4109-9892-23c1ceb60eaa","Type":"ContainerDied","Data":"1d5b78abbb5e2a59e1b1457349c70f190753858ae936d0ab19bb55cc724af44f"}
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.634864 4722 scope.go:117] "RemoveContainer" containerID="a02b0085e2ba8a5b6e93ff14529363efb51fc8a03bd2360fc27b7d63f1740346"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.683211 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.704884 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.726216 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:41:53 crc kubenswrapper[4722]: E0219 19:41:53.726734 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.726753 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log"
Feb 19 19:41:53 crc kubenswrapper[4722]: E0219 19:41:53.726769 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.726776 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.726971 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-log"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.726994 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" containerName="nova-metadata-metadata"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.728064 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.731195 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.743702 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.748011 4722 scope.go:117] "RemoveContainer" containerID="9e174e4ac1407291135c0ab1018e954feca197504199df9de4977de4b585f9e2"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.765545 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.803883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmmhf\" (UniqueName: \"kubernetes.io/projected/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-kube-api-access-hmmhf\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.803929 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-config-data\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.804031 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-logs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.804111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.804131 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.905699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.905944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.906071 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmmhf\" (UniqueName: \"kubernetes.io/projected/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-kube-api-access-hmmhf\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.906490 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-config-data\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.907016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-logs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.907539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-logs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.917373 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.917715 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-config-data\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.930823 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:53 crc kubenswrapper[4722]: I0219 19:41:53.944675 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmmhf\" (UniqueName: \"kubernetes.io/projected/5f2a647c-7a68-4e2c-aabf-b18973b20ad0-kube-api-access-hmmhf\") pod \"nova-metadata-0\" (UID: \"5f2a647c-7a68-4e2c-aabf-b18973b20ad0\") " pod="openstack/nova-metadata-0"
Feb 19 19:41:54 crc kubenswrapper[4722]: I0219 19:41:54.083348 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 19:41:54 crc kubenswrapper[4722]: I0219 19:41:54.584678 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 19:41:54 crc kubenswrapper[4722]: I0219 19:41:54.638906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f2a647c-7a68-4e2c-aabf-b18973b20ad0","Type":"ContainerStarted","Data":"b195759e300cb1e12ca80f40c8381b49a1330fb39c6de4f6257f2bf3956e25e5"}
Feb 19 19:41:54 crc kubenswrapper[4722]: I0219 19:41:54.642105 4722 generic.go:334] "Generic (PLEG): container finished" podID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerID="8b7451defcdb3c6dd2e97b1753af390c2ae8fee12578c7f793efbbbccdf699e8" exitCode=0
Feb 19 19:41:54 crc kubenswrapper[4722]: I0219 19:41:54.642300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerDied","Data":"8b7451defcdb3c6dd2e97b1753af390c2ae8fee12578c7f793efbbbccdf699e8"}
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.089602 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9140da-76d7-4109-9892-23c1ceb60eaa" path="/var/lib/kubelet/pods/3f9140da-76d7-4109-9892-23c1ceb60eaa/volumes"
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.458179 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.646477 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") "
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647021 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") "
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647107 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v5gc\" (UniqueName: \"kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") "
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647172 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") "
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647317 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") "
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647427 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle\") pod \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\" (UID: \"7cd6b11a-72fb-4116-8ccf-aee449ab564a\") "
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.647630 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs" (OuterVolumeSpecName: "logs") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.648204 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cd6b11a-72fb-4116-8ccf-aee449ab564a-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.650816 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc" (OuterVolumeSpecName: "kube-api-access-8v5gc") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "kube-api-access-8v5gc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.657109 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7cd6b11a-72fb-4116-8ccf-aee449ab564a","Type":"ContainerDied","Data":"169addbbb50b70578d635a167ff6f9ab09ad19713f22931dee2d5c96156a2632"}
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.657182 4722 scope.go:117] "RemoveContainer" containerID="8b7451defcdb3c6dd2e97b1753af390c2ae8fee12578c7f793efbbbccdf699e8"
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.657369 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.659498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f2a647c-7a68-4e2c-aabf-b18973b20ad0","Type":"ContainerStarted","Data":"8fbd7fe758e7bb8bd2bbeba5443112a38e88a0b29c952089007fe29f8e022d9d"}
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.659581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5f2a647c-7a68-4e2c-aabf-b18973b20ad0","Type":"ContainerStarted","Data":"09c2109fd234ad7bb2ce0190198df4bda0db494ce8d35f3e782b36fe08e9d5b9"}
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.679762 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data" (OuterVolumeSpecName: "config-data") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.691738 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.691719117 podStartE2EDuration="2.691719117s" podCreationTimestamp="2026-02-19 19:41:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:55.68028371 +0000 UTC m=+1415.292634044" watchObservedRunningTime="2026-02-19 19:41:55.691719117 +0000 UTC m=+1415.304069441"
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.692944 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.693395 4722 scope.go:117] "RemoveContainer" containerID="7dfeaedcfdea401372ee480bf2951f7e18eed6383fb5d16dc006019bff38c3a8"
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.717885 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.728399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7cd6b11a-72fb-4116-8ccf-aee449ab564a" (UID: "7cd6b11a-72fb-4116-8ccf-aee449ab564a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.750344 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v5gc\" (UniqueName: \"kubernetes.io/projected/7cd6b11a-72fb-4116-8ccf-aee449ab564a-kube-api-access-8v5gc\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.750376 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.750389 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.750399 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.750409 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cd6b11a-72fb-4116-8ccf-aee449ab564a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:41:55 crc kubenswrapper[4722]: I0219 19:41:55.993303 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.008975 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.024912 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:56 crc kubenswrapper[4722]: E0219 19:41:56.025396 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-log"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.025420 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-log"
Feb 19 19:41:56 crc kubenswrapper[4722]: E0219 19:41:56.025440 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-api"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.025448 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-api"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.025696 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-api"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.025739 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" containerName="nova-api-log"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.034609 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.037967 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.038278 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.038481 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.040964 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057712 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgzvt\" (UniqueName: \"kubernetes.io/projected/5aaacc6a-6882-467d-b66f-0178ccd35955-kube-api-access-fgzvt\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057733 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-public-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057766 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-config-data\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aaacc6a-6882-467d-b66f-0178ccd35955-logs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.057943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.159074 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.160213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzvt\" (UniqueName: \"kubernetes.io/projected/5aaacc6a-6882-467d-b66f-0178ccd35955-kube-api-access-fgzvt\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.160257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-public-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.160338 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-config-data\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.160752 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aaacc6a-6882-467d-b66f-0178ccd35955-logs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.160847 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.161328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5aaacc6a-6882-467d-b66f-0178ccd35955-logs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.164850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.167833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-internal-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.168213 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-public-tls-certs\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.173189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5aaacc6a-6882-467d-b66f-0178ccd35955-config-data\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.177800 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgzvt\" (UniqueName: \"kubernetes.io/projected/5aaacc6a-6882-467d-b66f-0178ccd35955-kube-api-access-fgzvt\") pod \"nova-api-0\" (UID: \"5aaacc6a-6882-467d-b66f-0178ccd35955\") " pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.361083 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.567060 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 19:41:56 crc kubenswrapper[4722]: I0219 19:41:56.857766 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 19:41:57 crc kubenswrapper[4722]: I0219 19:41:57.082895 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd6b11a-72fb-4116-8ccf-aee449ab564a" path="/var/lib/kubelet/pods/7cd6b11a-72fb-4116-8ccf-aee449ab564a/volumes"
Feb 19 19:41:57 crc kubenswrapper[4722]: I0219 19:41:57.685968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5aaacc6a-6882-467d-b66f-0178ccd35955","Type":"ContainerStarted","Data":"448fb2217cc376c1b0880710bf5dcf49e19a7dc4a1e40b77bafda98677e76a38"}
Feb 19 19:41:57 crc kubenswrapper[4722]: I0219 19:41:57.686283 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5aaacc6a-6882-467d-b66f-0178ccd35955","Type":"ContainerStarted","Data":"8a415958535c5343d60761a608a136b6e4e7fac82bf7cd448cdb9af0e6aeb3e9"}
Feb 19 19:41:57 crc kubenswrapper[4722]: I0219 19:41:57.686295 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5aaacc6a-6882-467d-b66f-0178ccd35955","Type":"ContainerStarted","Data":"800e57beaef9bb1445abc3a44c898eaed106d2e06403c3ee50dc90d556b51a15"}
Feb 19 19:41:57 crc kubenswrapper[4722]: I0219 19:41:57.716190 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.716174223 podStartE2EDuration="2.716174223s" podCreationTimestamp="2026-02-19 19:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:41:57.714950916 +0000 UTC m=+1417.327301240" watchObservedRunningTime="2026-02-19 19:41:57.716174223 +0000 UTC m=+1417.328524547"
Feb 19 19:41:59 crc kubenswrapper[4722]: I0219 19:41:59.086885 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 19:41:59 crc kubenswrapper[4722]: I0219 19:41:59.086963 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 19:42:01 crc kubenswrapper[4722]: I0219 19:42:01.567244 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 19 19:42:01 crc kubenswrapper[4722]: I0219 19:42:01.595750 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 19 19:42:01 crc kubenswrapper[4722]: I0219 19:42:01.784555 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 19 19:42:04 crc kubenswrapper[4722]: I0219 19:42:04.084340 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 19:42:04 crc kubenswrapper[4722]: I0219 19:42:04.084691 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 19:42:05 crc kubenswrapper[4722]: I0219 19:42:05.096355 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5f2a647c-7a68-4e2c-aabf-b18973b20ad0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:42:05 crc kubenswrapper[4722]: I0219 19:42:05.096430 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5f2a647c-7a68-4e2c-aabf-b18973b20ad0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:42:06 crc kubenswrapper[4722]: I0219 19:42:06.362059 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 19:42:06 crc kubenswrapper[4722]: I0219 19:42:06.362123 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 19:42:07 crc kubenswrapper[4722]: I0219 19:42:07.374410 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5aaacc6a-6882-467d-b66f-0178ccd35955" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:42:07 crc kubenswrapper[4722]: I0219 19:42:07.374438 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5aaacc6a-6882-467d-b66f-0178ccd35955" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 19:42:08 crc kubenswrapper[4722]: I0219 19:42:08.897907 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 19 19:42:11 crc kubenswrapper[4722]: I0219 19:42:11.798870 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:42:11 crc kubenswrapper[4722]: I0219 19:42:11.799140 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:42:14 crc kubenswrapper[4722]: I0219 19:42:14.088272 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 19:42:14 crc kubenswrapper[4722]: I0219 19:42:14.088892 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 19:42:14 crc kubenswrapper[4722]: I0219 19:42:14.094076 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 19:42:14 crc kubenswrapper[4722]: I0219 19:42:14.883216 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.370118 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.371061 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.372780 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.391003 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.895424 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 19:42:16 crc kubenswrapper[4722]: I0219 19:42:16.905654 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.148477 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-xdgs2"]
Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.159136 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-xdgs2"]
Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.236778 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-rwjf7"]
Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.238356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-rwjf7"
Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.240199 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.249563 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-rwjf7"]
Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.394978 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7"
Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.395061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpfg6\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7"
Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.395226 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7"
Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.395278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.395332 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.496927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.497028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpfg6\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.497220 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.497248 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.497310 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.502952 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.503001 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.503670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.504438 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " 
pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.518781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpfg6\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6\") pod \"cloudkitty-db-sync-rwjf7\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:26 crc kubenswrapper[4722]: I0219 19:42:26.559693 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.083819 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb399ce1-7269-4d99-9140-0d1d33a6fd6a" path="/var/lib/kubelet/pods/fb399ce1-7269-4d99-9140-0d1d33a6fd6a/volumes" Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.183218 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-rwjf7"] Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.822844 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.823425 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-central-agent" containerID="cri-o://28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679" gracePeriod=30 Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.823533 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="proxy-httpd" containerID="cri-o://4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4" gracePeriod=30 Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.823503 4722 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="sg-core" containerID="cri-o://4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f" gracePeriod=30 Feb 19 19:42:27 crc kubenswrapper[4722]: I0219 19:42:27.823546 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-notification-agent" containerID="cri-o://cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d" gracePeriod=30 Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.004998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-rwjf7" event={"ID":"3eb4da3f-b07b-4b6f-a524-8b2af229ed87","Type":"ContainerStarted","Data":"b01a64e732c528d91886fbc6303b22a5ddc8e59f3b32f31c6e2bfee4be333b08"} Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.005039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-rwjf7" event={"ID":"3eb4da3f-b07b-4b6f-a524-8b2af229ed87","Type":"ContainerStarted","Data":"b3b56e96586072af8aee5e36c8322fcbbe0a4be38b365b37d145b3ead8b232b8"} Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.008384 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerID="4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f" exitCode=2 Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.008440 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerDied","Data":"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f"} Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.028430 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-rwjf7" podStartSLOduration=1.855650898 podStartE2EDuration="2.028405115s" 
podCreationTimestamp="2026-02-19 19:42:26 +0000 UTC" firstStartedPulling="2026-02-19 19:42:27.19410131 +0000 UTC m=+1446.806451634" lastFinishedPulling="2026-02-19 19:42:27.366855527 +0000 UTC m=+1446.979205851" observedRunningTime="2026-02-19 19:42:28.019561478 +0000 UTC m=+1447.631911802" watchObservedRunningTime="2026-02-19 19:42:28.028405115 +0000 UTC m=+1447.640755439" Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.565707 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 19:42:28 crc kubenswrapper[4722]: I0219 19:42:28.631393 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.020208 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerID="4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4" exitCode=0 Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.020241 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerID="28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679" exitCode=0 Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.020506 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerDied","Data":"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4"} Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.020631 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerDied","Data":"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679"} Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.883626 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.906735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.906841 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.906941 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.906974 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907014 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz5fq\" (UniqueName: \"kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907041 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907173 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data\") pod \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\" (UID: \"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37\") " Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907646 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.907775 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.920163 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts" (OuterVolumeSpecName: "scripts") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:29 crc kubenswrapper[4722]: I0219 19:42:29.928190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq" (OuterVolumeSpecName: "kube-api-access-sz5fq") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "kube-api-access-sz5fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.002372 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.009073 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.009721 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.009795 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.009856 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz5fq\" (UniqueName: \"kubernetes.io/projected/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-kube-api-access-sz5fq\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.009912 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045278 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data" (OuterVolumeSpecName: "config-data") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045405 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045313 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerID="cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d" exitCode=0 Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerDied","Data":"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d"} Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045750 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37","Type":"ContainerDied","Data":"70a6581ff6c48dbe85b738501a440479409b7191194cf27a145b098c06b0a532"} Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.045829 4722 scope.go:117] "RemoveContainer" containerID="4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.055745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.094360 4722 scope.go:117] "RemoveContainer" containerID="4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.109443 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" (UID: "fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.111904 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.111930 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.111939 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.125856 4722 scope.go:117] "RemoveContainer" containerID="cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.163429 4722 scope.go:117] "RemoveContainer" containerID="28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.187360 4722 scope.go:117] "RemoveContainer" 
containerID="4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.189635 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4\": container with ID starting with 4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4 not found: ID does not exist" containerID="4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.189690 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4"} err="failed to get container status \"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4\": rpc error: code = NotFound desc = could not find container \"4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4\": container with ID starting with 4ee71b07481ffa29661513ba3507e16fc4662d74b1d8600b2968e90e57d8e7a4 not found: ID does not exist" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.189716 4722 scope.go:117] "RemoveContainer" containerID="4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.190007 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f\": container with ID starting with 4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f not found: ID does not exist" containerID="4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.190032 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f"} err="failed to get container status \"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f\": rpc error: code = NotFound desc = could not find container \"4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f\": container with ID starting with 4c2752f9cf400e7f4021f5cccbef89b82a9b455088293f823cd7a1893eee120f not found: ID does not exist" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.190044 4722 scope.go:117] "RemoveContainer" containerID="cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.192172 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d\": container with ID starting with cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d not found: ID does not exist" containerID="cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.192225 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d"} err="failed to get container status \"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d\": rpc error: code = NotFound desc = could not find container \"cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d\": container with ID starting with cba49f16d1bed6517310fa478f956442daf8508692c9e7274c9d72ddc479948d not found: ID does not exist" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.192261 4722 scope.go:117] "RemoveContainer" containerID="28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.192768 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679\": container with ID starting with 28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679 not found: ID does not exist" containerID="28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.192797 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679"} err="failed to get container status \"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679\": rpc error: code = NotFound desc = could not find container \"28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679\": container with ID starting with 28f4861bb1b6d8d30bc16e2166dfff5069bdce52bc7088dfca6fb1535acf3679 not found: ID does not exist" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.432142 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.441126 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.483436 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.483984 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="sg-core" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484045 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="sg-core" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.484110 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-central-agent" Feb 19 19:42:30 crc 
kubenswrapper[4722]: I0219 19:42:30.484177 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-central-agent" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.484265 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="proxy-httpd" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484317 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="proxy-httpd" Feb 19 19:42:30 crc kubenswrapper[4722]: E0219 19:42:30.484370 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-notification-agent" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484421 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-notification-agent" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484640 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-central-agent" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484709 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="ceilometer-notification-agent" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484764 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="proxy-httpd" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.484819 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" containerName="sg-core" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.515180 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8658k\" (UniqueName: \"kubernetes.io/projected/1e7133a0-5642-4b7b-a560-d215b7fd75cd-kube-api-access-8658k\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521738 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-run-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521779 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521812 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-log-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521914 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-config-data\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.521967 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-scripts\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.522571 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.532764 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.538468 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.547599 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.622521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.622762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-log-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.622840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.622908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-config-data\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.622970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-scripts\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.623100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8658k\" (UniqueName: \"kubernetes.io/projected/1e7133a0-5642-4b7b-a560-d215b7fd75cd-kube-api-access-8658k\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.623278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-run-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.623386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.627351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-config-data\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.628856 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-log-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.629544 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1e7133a0-5642-4b7b-a560-d215b7fd75cd-run-httpd\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.630166 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.630222 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.634678 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-scripts\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.640843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1e7133a0-5642-4b7b-a560-d215b7fd75cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.675239 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8658k\" (UniqueName: \"kubernetes.io/projected/1e7133a0-5642-4b7b-a560-d215b7fd75cd-kube-api-access-8658k\") pod \"ceilometer-0\" (UID: \"1e7133a0-5642-4b7b-a560-d215b7fd75cd\") " pod="openstack/ceilometer-0" Feb 19 19:42:30 crc kubenswrapper[4722]: I0219 19:42:30.876418 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 19:42:31 crc kubenswrapper[4722]: I0219 19:42:31.077478 4722 generic.go:334] "Generic (PLEG): container finished" podID="3eb4da3f-b07b-4b6f-a524-8b2af229ed87" containerID="b01a64e732c528d91886fbc6303b22a5ddc8e59f3b32f31c6e2bfee4be333b08" exitCode=0 Feb 19 19:42:31 crc kubenswrapper[4722]: I0219 19:42:31.087100 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37" path="/var/lib/kubelet/pods/fe6ac8e6-55da-4c9f-b3b3-fc60afc50c37/volumes" Feb 19 19:42:31 crc kubenswrapper[4722]: I0219 19:42:31.088003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-rwjf7" event={"ID":"3eb4da3f-b07b-4b6f-a524-8b2af229ed87","Type":"ContainerDied","Data":"b01a64e732c528d91886fbc6303b22a5ddc8e59f3b32f31c6e2bfee4be333b08"} Feb 19 19:42:31 crc kubenswrapper[4722]: I0219 19:42:31.406268 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 19:42:32 crc kubenswrapper[4722]: I0219 19:42:32.453723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7133a0-5642-4b7b-a560-d215b7fd75cd","Type":"ContainerStarted","Data":"31a8f7404f8354090b9f106b04a82029d92d5b34c33d3d24f9b8b2de1a682a99"} Feb 19 19:42:32 crc kubenswrapper[4722]: I0219 19:42:32.987427 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.155441 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpfg6\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6\") pod \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.155494 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts\") pod \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.155571 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data\") pod \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.155602 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle\") pod \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.155735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs\") pod \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\" (UID: \"3eb4da3f-b07b-4b6f-a524-8b2af229ed87\") " Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.166479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6" (OuterVolumeSpecName: "kube-api-access-vpfg6") pod "3eb4da3f-b07b-4b6f-a524-8b2af229ed87" (UID: "3eb4da3f-b07b-4b6f-a524-8b2af229ed87"). InnerVolumeSpecName "kube-api-access-vpfg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.184825 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs" (OuterVolumeSpecName: "certs") pod "3eb4da3f-b07b-4b6f-a524-8b2af229ed87" (UID: "3eb4da3f-b07b-4b6f-a524-8b2af229ed87"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.189356 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts" (OuterVolumeSpecName: "scripts") pod "3eb4da3f-b07b-4b6f-a524-8b2af229ed87" (UID: "3eb4da3f-b07b-4b6f-a524-8b2af229ed87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.197070 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data" (OuterVolumeSpecName: "config-data") pod "3eb4da3f-b07b-4b6f-a524-8b2af229ed87" (UID: "3eb4da3f-b07b-4b6f-a524-8b2af229ed87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.210435 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3eb4da3f-b07b-4b6f-a524-8b2af229ed87" (UID: "3eb4da3f-b07b-4b6f-a524-8b2af229ed87"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.250296 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-86mtg"] Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.259575 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.260023 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.260065 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.260077 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpfg6\" (UniqueName: \"kubernetes.io/projected/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-kube-api-access-vpfg6\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.260086 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3eb4da3f-b07b-4b6f-a524-8b2af229ed87-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.266958 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-86mtg"] Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.364216 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-77rmn"] Feb 19 19:42:33 crc kubenswrapper[4722]: E0219 19:42:33.364612 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3eb4da3f-b07b-4b6f-a524-8b2af229ed87" containerName="cloudkitty-db-sync" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.364628 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb4da3f-b07b-4b6f-a524-8b2af229ed87" containerName="cloudkitty-db-sync" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.364816 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb4da3f-b07b-4b6f-a524-8b2af229ed87" containerName="cloudkitty-db-sync" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.365752 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.407212 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-77rmn"] Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.474334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-rwjf7" event={"ID":"3eb4da3f-b07b-4b6f-a524-8b2af229ed87","Type":"ContainerDied","Data":"b3b56e96586072af8aee5e36c8322fcbbe0a4be38b365b37d145b3ead8b232b8"} Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.474371 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b56e96586072af8aee5e36c8322fcbbe0a4be38b365b37d145b3ead8b232b8" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.474428 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-rwjf7" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.567449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.567567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.567593 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.567685 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.567711 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55sdv\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv\") pod \"cloudkitty-storageinit-77rmn\" (UID: 
\"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.669646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.669689 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55sdv\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.669786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.669839 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.669858 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " 
pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.676749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.677087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.684347 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.686113 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.697825 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55sdv\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv\") pod \"cloudkitty-storageinit-77rmn\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:33 crc kubenswrapper[4722]: I0219 19:42:33.983109 4722 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:34 crc kubenswrapper[4722]: I0219 19:42:34.103503 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq" containerID="cri-o://fa229a7bb206de4ccc0307a479f0fa815abfa412795902c84987eb4df94f0285" gracePeriod=604795 Feb 19 19:42:34 crc kubenswrapper[4722]: I0219 19:42:34.129578 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Feb 19 19:42:34 crc kubenswrapper[4722]: I0219 19:42:34.196115 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="rabbitmq" containerID="cri-o://37cb328a31e79626446e5419a5f224da9c1e9f252a7b3a3099897e049cefbfc4" gracePeriod=604795 Feb 19 19:42:34 crc kubenswrapper[4722]: I0219 19:42:34.768724 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Feb 19 19:42:35 crc kubenswrapper[4722]: I0219 19:42:35.086513 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1725704f-c153-4de4-9246-87c6a5e878ea" path="/var/lib/kubelet/pods/1725704f-c153-4de4-9246-87c6a5e878ea/volumes" Feb 19 19:42:36 crc kubenswrapper[4722]: I0219 19:42:36.262695 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-77rmn"] Feb 19 19:42:36 crc kubenswrapper[4722]: I0219 19:42:36.514600 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"1e7133a0-5642-4b7b-a560-d215b7fd75cd","Type":"ContainerStarted","Data":"a004167d31b52f8c806e89b20758fe28662af134f7e6e33b89fe228caeb98f77"} Feb 19 19:42:36 crc kubenswrapper[4722]: I0219 19:42:36.528076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-77rmn" event={"ID":"04e19f64-06d2-4c0e-b33c-000fea5deb27","Type":"ContainerStarted","Data":"362971b9a3d43c23fcf4d469e680dc4a2402d33c6f74122b268ea70290fba5b3"} Feb 19 19:42:37 crc kubenswrapper[4722]: I0219 19:42:37.542260 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7133a0-5642-4b7b-a560-d215b7fd75cd","Type":"ContainerStarted","Data":"6a5cb7d2117f83b267ee6ac4e148de6215240af48d16642017360e18757f8212"} Feb 19 19:42:37 crc kubenswrapper[4722]: I0219 19:42:37.542745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7133a0-5642-4b7b-a560-d215b7fd75cd","Type":"ContainerStarted","Data":"48de6046a4b3a2b298a2baa908f92364157f6b322e36c252526912b33b7a56d1"} Feb 19 19:42:37 crc kubenswrapper[4722]: I0219 19:42:37.543960 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-77rmn" event={"ID":"04e19f64-06d2-4c0e-b33c-000fea5deb27","Type":"ContainerStarted","Data":"6218fa1a82ddfb695bd10c74992d9f549d06629abf7b79a733758e50532f43fb"} Feb 19 19:42:37 crc kubenswrapper[4722]: I0219 19:42:37.586975 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-77rmn" podStartSLOduration=4.586954189 podStartE2EDuration="4.586954189s" podCreationTimestamp="2026-02-19 19:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:42:37.563833756 +0000 UTC m=+1457.176184080" watchObservedRunningTime="2026-02-19 19:42:37.586954189 +0000 UTC m=+1457.199304523" Feb 19 19:42:38 crc 
kubenswrapper[4722]: I0219 19:42:38.555164 4722 generic.go:334] "Generic (PLEG): container finished" podID="04e19f64-06d2-4c0e-b33c-000fea5deb27" containerID="6218fa1a82ddfb695bd10c74992d9f549d06629abf7b79a733758e50532f43fb" exitCode=0 Feb 19 19:42:38 crc kubenswrapper[4722]: I0219 19:42:38.555267 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-77rmn" event={"ID":"04e19f64-06d2-4c0e-b33c-000fea5deb27","Type":"ContainerDied","Data":"6218fa1a82ddfb695bd10c74992d9f549d06629abf7b79a733758e50532f43fb"} Feb 19 19:42:39 crc kubenswrapper[4722]: I0219 19:42:39.575412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1e7133a0-5642-4b7b-a560-d215b7fd75cd","Type":"ContainerStarted","Data":"93569e7cbcdc63a223dc3937f2e787df75351ff24e9699ce9ca7c60fe16bb23b"} Feb 19 19:42:39 crc kubenswrapper[4722]: I0219 19:42:39.598859 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.197036399 podStartE2EDuration="9.598840793s" podCreationTimestamp="2026-02-19 19:42:30 +0000 UTC" firstStartedPulling="2026-02-19 19:42:31.410205187 +0000 UTC m=+1451.022555511" lastFinishedPulling="2026-02-19 19:42:38.812009581 +0000 UTC m=+1458.424359905" observedRunningTime="2026-02-19 19:42:39.59842743 +0000 UTC m=+1459.210777764" watchObservedRunningTime="2026-02-19 19:42:39.598840793 +0000 UTC m=+1459.211191107" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.174890 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.329986 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55sdv\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv\") pod \"04e19f64-06d2-4c0e-b33c-000fea5deb27\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.330215 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts\") pod \"04e19f64-06d2-4c0e-b33c-000fea5deb27\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.330252 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle\") pod \"04e19f64-06d2-4c0e-b33c-000fea5deb27\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.330350 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data\") pod \"04e19f64-06d2-4c0e-b33c-000fea5deb27\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.330406 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs\") pod \"04e19f64-06d2-4c0e-b33c-000fea5deb27\" (UID: \"04e19f64-06d2-4c0e-b33c-000fea5deb27\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.337294 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts" (OuterVolumeSpecName: "scripts") pod "04e19f64-06d2-4c0e-b33c-000fea5deb27" (UID: "04e19f64-06d2-4c0e-b33c-000fea5deb27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.345311 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv" (OuterVolumeSpecName: "kube-api-access-55sdv") pod "04e19f64-06d2-4c0e-b33c-000fea5deb27" (UID: "04e19f64-06d2-4c0e-b33c-000fea5deb27"). InnerVolumeSpecName "kube-api-access-55sdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.345542 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs" (OuterVolumeSpecName: "certs") pod "04e19f64-06d2-4c0e-b33c-000fea5deb27" (UID: "04e19f64-06d2-4c0e-b33c-000fea5deb27"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.368450 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04e19f64-06d2-4c0e-b33c-000fea5deb27" (UID: "04e19f64-06d2-4c0e-b33c-000fea5deb27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.371674 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data" (OuterVolumeSpecName: "config-data") pod "04e19f64-06d2-4c0e-b33c-000fea5deb27" (UID: "04e19f64-06d2-4c0e-b33c-000fea5deb27"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.432763 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.432822 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55sdv\" (UniqueName: \"kubernetes.io/projected/04e19f64-06d2-4c0e-b33c-000fea5deb27-kube-api-access-55sdv\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.432848 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.432869 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.432892 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e19f64-06d2-4c0e-b33c-000fea5deb27-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.584726 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-77rmn" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.584725 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-77rmn" event={"ID":"04e19f64-06d2-4c0e-b33c-000fea5deb27","Type":"ContainerDied","Data":"362971b9a3d43c23fcf4d469e680dc4a2402d33c6f74122b268ea70290fba5b3"} Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.584849 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="362971b9a3d43c23fcf4d469e680dc4a2402d33c6f74122b268ea70290fba5b3" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.587060 4722 generic.go:334] "Generic (PLEG): container finished" podID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerID="fa229a7bb206de4ccc0307a479f0fa815abfa412795902c84987eb4df94f0285" exitCode=0 Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.587131 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerDied","Data":"fa229a7bb206de4ccc0307a479f0fa815abfa412795902c84987eb4df94f0285"} Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.592333 4722 generic.go:334] "Generic (PLEG): container finished" podID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerID="37cb328a31e79626446e5419a5f224da9c1e9f252a7b3a3099897e049cefbfc4" exitCode=0 Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.592399 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerDied","Data":"37cb328a31e79626446e5419a5f224da9c1e9f252a7b3a3099897e049cefbfc4"} Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.592657 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.720197 4722 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.841720 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.841865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.841912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.841940 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56t8r\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.841975 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842007 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842141 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842234 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842273 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.842316 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie\") pod \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\" (UID: \"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45\") " Feb 19 
19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.844994 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.845170 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.847211 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.853510 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info" (OuterVolumeSpecName: "pod-info") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.856141 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.857188 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.858411 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r" (OuterVolumeSpecName: "kube-api-access-56t8r") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "kube-api-access-56t8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.885234 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda" (OuterVolumeSpecName: "persistence") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.887308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data" (OuterVolumeSpecName: "config-data") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.919452 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.926356 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf" (OuterVolumeSpecName: "server-conf") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946878 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946912 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946926 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946954 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") on node \"crc\" " Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946969 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946981 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.946993 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56t8r\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-kube-api-access-56t8r\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.947005 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.947016 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:40 crc kubenswrapper[4722]: I0219 19:42:40.947025 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.002221 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" (UID: "5d19e5cc-ef2a-4497-b45f-1a240fa1dd45"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.020961 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.021649 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda") on node "crc" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.047870 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.047930 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.048004 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.048036 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.048119 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls\") pod 
\"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.048161 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.049202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.050717 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.050784 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.050822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5nxh\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 
19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.050915 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.050973 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret\") pod \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\" (UID: \"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f\") " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.051275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.051709 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.051732 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.051746 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.051762 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.053504 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.058012 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). 
InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.059425 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh" (OuterVolumeSpecName: "kube-api-access-k5nxh") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "kube-api-access-k5nxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.067677 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.069281 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info" (OuterVolumeSpecName: "pod-info") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.137534 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51" (OuterVolumeSpecName: "persistence") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "pvc-a5fb8482-d574-4930-864c-175c2bedef51". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156073 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156123 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") on node \"crc\" " Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156137 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156169 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5nxh\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-kube-api-access-k5nxh\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156179 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.156188 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.165276 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf" (OuterVolumeSpecName: "server-conf") pod 
"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.183338 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data" (OuterVolumeSpecName: "config-data") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.258854 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.258995 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.261412 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.262286 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a5fb8482-d574-4930-864c-175c2bedef51" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51") on node "crc" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.341838 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" (UID: "75733b46-d9a1-4cbe-b3ae-2dd39c98e54f"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.361322 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.361359 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.363034 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.363286 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="00bbae7e-ebc6-4102-9398-fc131546bbf5" containerName="cloudkitty-proc" containerID="cri-o://20d63437963fbb92aa14a89d0ac3100abcdfca03a493c9976283dbcbad9c2d7e" gracePeriod=30 Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.389592 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.389861 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api-log" containerID="cri-o://35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422" gracePeriod=30 Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.389964 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api" 
containerID="cri-o://fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971" gracePeriod=30
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.610805 4722 generic.go:334] "Generic (PLEG): container finished" podID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerID="35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422" exitCode=143
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.610878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerDied","Data":"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422"}
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.614527 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5d19e5cc-ef2a-4497-b45f-1a240fa1dd45","Type":"ContainerDied","Data":"24279a6d2caf7ad4b1f181fa89124ed3ff752cfc1180df75df7a96c88d0345e2"}
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.614582 4722 scope.go:117] "RemoveContainer" containerID="fa229a7bb206de4ccc0307a479f0fa815abfa412795902c84987eb4df94f0285"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.614553 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.627415 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.627465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"75733b46-d9a1-4cbe-b3ae-2dd39c98e54f","Type":"ContainerDied","Data":"3a2845abf856d9cafaeec46534beacb5f3f1990d5bed57b69cf295f8fe01e4f1"}
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.655426 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.670622 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.707369 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 19:42:41 crc kubenswrapper[4722]: E0219 19:42:41.707922 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.707948 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq"
Feb 19 19:42:41 crc kubenswrapper[4722]: E0219 19:42:41.707966 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="rabbitmq"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.707974 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="rabbitmq"
Feb 19 19:42:41 crc kubenswrapper[4722]: E0219 19:42:41.708004 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="setup-container"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708012 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="setup-container"
Feb 19 19:42:41 crc kubenswrapper[4722]: E0219 19:42:41.708033 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e19f64-06d2-4c0e-b33c-000fea5deb27" containerName="cloudkitty-storageinit"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708043 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e19f64-06d2-4c0e-b33c-000fea5deb27" containerName="cloudkitty-storageinit"
Feb 19 19:42:41 crc kubenswrapper[4722]: E0219 19:42:41.708057 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="setup-container"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708064 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="setup-container"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708448 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" containerName="rabbitmq"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708484 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e19f64-06d2-4c0e-b33c-000fea5deb27" containerName="cloudkitty-storageinit"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.708499 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" containerName="rabbitmq"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.716962 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.717212 4722 scope.go:117] "RemoveContainer" containerID="c749648f12e8840f28b25f37f34a53275ed4fc33d82900da005066210acf9af2"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.721248 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cbm8q"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.721392 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.721528 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.721584 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.721748 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.747218 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.751852 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.752042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769245 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769272 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-config-data\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769369 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769435 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f14785b-2e99-4110-9523-78ec32490e71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769478 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vflz\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-kube-api-access-9vflz\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f14785b-2e99-4110-9523-78ec32490e71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769549 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.769577 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.796871 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.798800 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.798844 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.798884 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.802668 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.802789 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17" gracePeriod=600
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.839758 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.863010 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.865276 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.869060 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.869516 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.869774 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.870055 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-qdf2m"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f14785b-2e99-4110-9523-78ec32490e71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871594 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871668 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871731 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-config-data\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871871 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871915 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.871984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f14785b-2e99-4110-9523-78ec32490e71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.872044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vflz\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-kube-api-access-9vflz\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.872477 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.872676 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.872868 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.873140 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.873814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.874425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.877687 4722 scope.go:117] "RemoveContainer" containerID="37cb328a31e79626446e5419a5f224da9c1e9f252a7b3a3099897e049cefbfc4"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.877920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9f14785b-2e99-4110-9523-78ec32490e71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.878132 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9f14785b-2e99-4110-9523-78ec32490e71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.879297 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-config-data\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.879867 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9f14785b-2e99-4110-9523-78ec32490e71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.882875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.896621 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.902750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.903710 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.903750 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5f6cee635ca5e2d348cf915d62a0dac8d2194b66bba55200fe901088eac3f7dd/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.921211 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vflz\" (UniqueName: \"kubernetes.io/projected/9f14785b-2e99-4110-9523-78ec32490e71-kube-api-access-9vflz\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.958550 4722 scope.go:117] "RemoveContainer" containerID="e127436a9b7fd84ddf258ebc3a3c64c5ddb9a7269490c5535eccdc44ec44422d"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973571 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973698 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973753 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973796 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973858 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973885 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.973947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.974002 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jm4f\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-kube-api-access-8jm4f\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:41 crc kubenswrapper[4722]: I0219 19:42:41.974084 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.016721 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f9764aaa-2750-42f6-8760-20cf0a0eceda\") pod \"rabbitmq-server-0\" (UID: \"9f14785b-2e99-4110-9523-78ec32490e71\") " pod="openstack/rabbitmq-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084298 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084391 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084426 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084447 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084469 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084505 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084583 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jm4f\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-kube-api-access-8jm4f\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.084626 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.088585 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.089719 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.092929 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.092960 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.093961 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.093972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.095866 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.096068 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.100596 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.112865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jm4f\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-kube-api-access-8jm4f\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.113757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ac0e00c-0e1d-40fa-802d-8a77ac4c842b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.224878 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.224926 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6408a1f41ebba08884844654cc07aafa4a02aa7486293e45dd19f823f7662d43/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.328700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a5fb8482-d574-4930-864c-175c2bedef51\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a5fb8482-d574-4930-864c-175c2bedef51\") pod \"rabbitmq-cell1-server-0\" (UID: \"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.516696 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.664784 4722 generic.go:334] "Generic (PLEG): container finished" podID="00bbae7e-ebc6-4102-9398-fc131546bbf5" containerID="20d63437963fbb92aa14a89d0ac3100abcdfca03a493c9976283dbcbad9c2d7e" exitCode=0
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.665238 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"00bbae7e-ebc6-4102-9398-fc131546bbf5","Type":"ContainerDied","Data":"20d63437963fbb92aa14a89d0ac3100abcdfca03a493c9976283dbcbad9c2d7e"}
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.675435 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17" exitCode=0
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.675513 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17"}
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.675538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"}
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.675566 4722 scope.go:117] "RemoveContainer" containerID="3f9ea5233c8da68a82202932b76beffc960ff77ead8fdc47e6fb7d01f484e9a5"
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.803034 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 19:42:42 crc kubenswrapper[4722]: I0219 19:42:42.963902 4722 util.go:48] "No ready sandbox for pod can be
found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.060563 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.085211 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d19e5cc-ef2a-4497-b45f-1a240fa1dd45" path="/var/lib/kubelet/pods/5d19e5cc-ef2a-4497-b45f-1a240fa1dd45/volumes" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.086164 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75733b46-d9a1-4cbe-b3ae-2dd39c98e54f" path="/var/lib/kubelet/pods/75733b46-d9a1-4cbe-b3ae-2dd39c98e54f/volumes" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113284 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113392 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113525 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113589 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113616 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.113721 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw245\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245\") pod \"00bbae7e-ebc6-4102-9398-fc131546bbf5\" (UID: \"00bbae7e-ebc6-4102-9398-fc131546bbf5\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.127516 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts" (OuterVolumeSpecName: "scripts") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.141358 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs" (OuterVolumeSpecName: "certs") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.142431 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.146582 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245" (OuterVolumeSpecName: "kube-api-access-zw245") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "kube-api-access-zw245". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.176505 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data" (OuterVolumeSpecName: "config-data") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.182401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00bbae7e-ebc6-4102-9398-fc131546bbf5" (UID: "00bbae7e-ebc6-4102-9398-fc131546bbf5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219814 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219887 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219901 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw245\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-kube-api-access-zw245\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219913 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/00bbae7e-ebc6-4102-9398-fc131546bbf5-certs\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219924 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.219934 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00bbae7e-ebc6-4102-9398-fc131546bbf5-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.337813 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:42:43 crc kubenswrapper[4722]: E0219 19:42:43.338502 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bbae7e-ebc6-4102-9398-fc131546bbf5" 
containerName="cloudkitty-proc" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.338518 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bbae7e-ebc6-4102-9398-fc131546bbf5" containerName="cloudkitty-proc" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.338739 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="00bbae7e-ebc6-4102-9398-fc131546bbf5" containerName="cloudkitty-proc" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.339810 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.342007 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.356829 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529062 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529143 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65zd2\" (UniqueName: \"kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529287 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529346 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.529623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.594613 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.633182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.633530 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.634948 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.635060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.635198 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" 
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.635335 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65zd2\" (UniqueName: \"kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.635555 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.636408 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.634366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.637222 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.634441 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.638005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.638759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.667930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65zd2\" (UniqueName: \"kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2\") pod \"dnsmasq-dns-dbb88bf8c-zzl74\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.677744 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.694302 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.694310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"00bbae7e-ebc6-4102-9398-fc131546bbf5","Type":"ContainerDied","Data":"fc84223d282573fb3fb01a61be1e1be06c5fe3404b335fffa4163163f1c67edb"} Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.694355 4722 scope.go:117] "RemoveContainer" containerID="20d63437963fbb92aa14a89d0ac3100abcdfca03a493c9976283dbcbad9c2d7e" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.722923 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b","Type":"ContainerStarted","Data":"bd0dba8d2d2388592c88639ec82f2ae2c7233392a83f9be2d726673271b2ec52"} Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.727641 4722 generic.go:334] "Generic (PLEG): container finished" podID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerID="fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971" exitCode=0 Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.727713 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerDied","Data":"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971"} Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.727741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"57386acb-6299-4fd3-80a2-25d8769dcc93","Type":"ContainerDied","Data":"42972e824b9c0da7ea1f6a0d1a02b3318f196426677f7d390159e0bf2aae2802"} Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.727808 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.743904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.743987 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br27h\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744082 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744134 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744195 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744218 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744310 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.744356 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts\") pod \"57386acb-6299-4fd3-80a2-25d8769dcc93\" (UID: \"57386acb-6299-4fd3-80a2-25d8769dcc93\") " Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.750872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f14785b-2e99-4110-9523-78ec32490e71","Type":"ContainerStarted","Data":"364ac221990d25f433c84726f172e8dd3fb628f012b85556817735eb2044da9f"} Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.752519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs" (OuterVolumeSpecName: "logs") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.762957 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.763656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs" (OuterVolumeSpecName: "certs") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.764109 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h" (OuterVolumeSpecName: "kube-api-access-br27h") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "kube-api-access-br27h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.764889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.775027 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.783724 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.783806 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts" (OuterVolumeSpecName: "scripts") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:42:43 crc kubenswrapper[4722]: E0219 19:42:43.784124 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api-log" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.784142 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api-log" Feb 19 19:42:43 crc kubenswrapper[4722]: E0219 19:42:43.784162 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.784169 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.784422 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api" Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.784451 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" containerName="cloudkitty-api-log" Feb 19 19:42:43 crc 
kubenswrapper[4722]: I0219 19:42:43.785398 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.796914 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.799167 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.808858 4722 scope.go:117] "RemoveContainer" containerID="fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.846598 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.846635 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.846644 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br27h\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-kube-api-access-br27h\") on node \"crc\" DevicePath \"\""
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.846654 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/57386acb-6299-4fd3-80a2-25d8769dcc93-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.846665 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/57386acb-6299-4fd3-80a2-25d8769dcc93-logs\") on node \"crc\" DevicePath \"\""
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.861926 4722 scope.go:117] "RemoveContainer" containerID="35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.908519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data" (OuterVolumeSpecName: "config-data") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.922419 4722 scope.go:117] "RemoveContainer" containerID="fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971"
Feb 19 19:42:43 crc kubenswrapper[4722]: E0219 19:42:43.923911 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971\": container with ID starting with fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971 not found: ID does not exist" containerID="fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.923950 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971"} err="failed to get container status \"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971\": rpc error: code = NotFound desc = could not find container \"fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971\": container with ID starting with fee2248587302f9e3e061ea0166113f711bcf0a137d00b7842c5b2cbd021f971 not found: ID does not exist"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.923977 4722 scope.go:117] "RemoveContainer" containerID="35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422"
Feb 19 19:42:43 crc kubenswrapper[4722]: E0219 19:42:43.924341 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422\": container with ID starting with 35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422 not found: ID does not exist" containerID="35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.924374 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422"} err="failed to get container status \"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422\": rpc error: code = NotFound desc = could not find container \"35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422\": container with ID starting with 35ad7d7d46733a6bef716557de2dfa0890b59231be314eb7efaac07557c78422 not found: ID does not exist"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.948560 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.948677 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-certs\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.948726 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.948809 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzb89\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-kube-api-access-dzb89\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.949188 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.949225 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:43 crc kubenswrapper[4722]: I0219 19:42:43.949531 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051439 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051574 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-certs\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051672 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzb89\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-kube-api-access-dzb89\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.051697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.055309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.055518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.057311 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-certs\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.057760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-scripts\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.113325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-config-data\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.113857 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzb89\" (UniqueName: \"kubernetes.io/projected/0e52a6ab-57a4-4fd1-bd50-1832e756fc7f-kube-api-access-dzb89\") pod \"cloudkitty-proc-0\" (UID: \"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f\") " pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.147178 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.201244 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.260636 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.333561 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"]
Feb 19 19:42:44 crc kubenswrapper[4722]: W0219 19:42:44.500706 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a987597_e2e2_431d_9583_01f4dc2f4ecf.slice/crio-4fa0562983df72acffc986b649c09943854e4927d1c687f922152874e0ab49dd WatchSource:0}: Error finding container 4fa0562983df72acffc986b649c09943854e4927d1c687f922152874e0ab49dd: Status 404 returned error can't find the container with id 4fa0562983df72acffc986b649c09943854e4927d1c687f922152874e0ab49dd
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.711795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.772052 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.779912 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" event={"ID":"7a987597-e2e2-431d-9583-01f4dc2f4ecf","Type":"ContainerStarted","Data":"4fa0562983df72acffc986b649c09943854e4927d1c687f922152874e0ab49dd"}
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.867312 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.943113 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "57386acb-6299-4fd3-80a2-25d8769dcc93" (UID: "57386acb-6299-4fd3-80a2-25d8769dcc93"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:42:44 crc kubenswrapper[4722]: I0219 19:42:44.981628 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57386acb-6299-4fd3-80a2-25d8769dcc93-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.082645 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00bbae7e-ebc6-4102-9398-fc131546bbf5" path="/var/lib/kubelet/pods/00bbae7e-ebc6-4102-9398-fc131546bbf5/volumes"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.341469 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.365002 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.376341 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.378664 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.382807 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.382906 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.384293 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.390362 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckks6\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-kube-api-access-ckks6\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497796 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497884 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497901 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.497957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc650d44-069f-41ed-b944-f1168dd5b25c-logs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.498048 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.498295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-scripts\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.498371 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-scripts\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601163 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601241 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckks6\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-kube-api-access-ckks6\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc650d44-069f-41ed-b944-f1168dd5b25c-logs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.601653 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.608268 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.608356 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.610320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-scripts\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.610657 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.610985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc650d44-069f-41ed-b944-f1168dd5b25c-logs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.612631 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.612994 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-certs\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.613788 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc650d44-069f-41ed-b944-f1168dd5b25c-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.629240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckks6\" (UniqueName: \"kubernetes.io/projected/fc650d44-069f-41ed-b944-f1168dd5b25c-kube-api-access-ckks6\") pod \"cloudkitty-api-0\" (UID: \"fc650d44-069f-41ed-b944-f1168dd5b25c\") " pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.694965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.793638 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b","Type":"ContainerStarted","Data":"b95673087d8ea4b9a6a852d0c1a317e33ef78571ef0754777ff9655eec8f3615"}
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.797160 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f","Type":"ContainerStarted","Data":"2d1729315347d24f6c9daec53fbcd531b838e35e89ebdbe59edac0a29ffea465"}
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.797203 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"0e52a6ab-57a4-4fd1-bd50-1832e756fc7f","Type":"ContainerStarted","Data":"f7c0dc467d58658d6bd1e1d8711abeeba50228c0e752ea74443cd5f54500974d"}
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.800810 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerID="82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8" exitCode=0
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.801056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" event={"ID":"7a987597-e2e2-431d-9583-01f4dc2f4ecf","Type":"ContainerDied","Data":"82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8"}
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.831223 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f14785b-2e99-4110-9523-78ec32490e71","Type":"ContainerStarted","Data":"62d771eb1e8a20f3816db1e78f60944ccbfed3ae437139348353bf9a91656d8f"}
Feb 19 19:42:45 crc kubenswrapper[4722]: I0219 19:42:45.916944 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.713768313 podStartE2EDuration="2.916920049s" podCreationTimestamp="2026-02-19 19:42:43 +0000 UTC" firstStartedPulling="2026-02-19 19:42:44.925192867 +0000 UTC m=+1464.537543191" lastFinishedPulling="2026-02-19 19:42:45.128344603 +0000 UTC m=+1464.740694927" observedRunningTime="2026-02-19 19:42:45.882484754 +0000 UTC m=+1465.494835078" watchObservedRunningTime="2026-02-19 19:42:45.916920049 +0000 UTC m=+1465.529270373"
Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.258650 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.842214 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fc650d44-069f-41ed-b944-f1168dd5b25c","Type":"ContainerStarted","Data":"1605922eead9cb45e86e6bd7dfadb78e8a868ca729ce40782cda5360ddc6cb27"}
Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.842572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fc650d44-069f-41ed-b944-f1168dd5b25c","Type":"ContainerStarted","Data":"b95b47dca55bf6a59f7e571ae6a728815e128a589b41f6a4b9724ce08d5b5bfa"}
Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.842589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"fc650d44-069f-41ed-b944-f1168dd5b25c","Type":"ContainerStarted","Data":"02aa0de2a5af9fdac494a793daf4a1810962c03e7c0554b5cacff7010d8b844d"}
Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.842629 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0"
Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.844652 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" event={"ID":"7a987597-e2e2-431d-9583-01f4dc2f4ecf","Type":"ContainerStarted","Data":"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4"}
Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.845430 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74"
Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.876697 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=1.876682264 podStartE2EDuration="1.876682264s" podCreationTimestamp="2026-02-19 19:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:42:46.869358265 +0000 UTC m=+1466.481708589" watchObservedRunningTime="2026-02-19 19:42:46.876682264 +0000 UTC m=+1466.489032588"
Feb 19 19:42:46 crc kubenswrapper[4722]: I0219 19:42:46.925294 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" podStartSLOduration=3.925273502 podStartE2EDuration="3.925273502s" podCreationTimestamp="2026-02-19 19:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:42:46.91336938 +0000 UTC m=+1466.525719704" watchObservedRunningTime="2026-02-19 19:42:46.925273502 +0000 UTC m=+1466.537623826"
Feb 19 19:42:47 crc kubenswrapper[4722]: I0219 19:42:47.083970 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57386acb-6299-4fd3-80a2-25d8769dcc93" path="/var/lib/kubelet/pods/57386acb-6299-4fd3-80a2-25d8769dcc93/volumes"
Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.680295 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74"
Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.748846 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"]
Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.749497 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="dnsmasq-dns" containerID="cri-o://07ae856c61611ad79b54a655cdc3c7aa79d812aa79705666cb7de6834474fefb" gracePeriod=10
Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.871222 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-8zj5g"]
Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.873003 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.897935 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-8zj5g"]
Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.966872 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerID="07ae856c61611ad79b54a655cdc3c7aa79d812aa79705666cb7de6834474fefb" exitCode=0
Feb 19 19:42:53 crc kubenswrapper[4722]: I0219 19:42:53.966911 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" event={"ID":"dfcca6fc-5afb-464c-9852-3532ba5878a3","Type":"ContainerDied","Data":"07ae856c61611ad79b54a655cdc3c7aa79d812aa79705666cb7de6834474fefb"}
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.016792 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.016857 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-svc\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.016973 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.017003 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.017049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-config\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.017166 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.017366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqc2q\" (UniqueName: \"kubernetes.io/projected/f6d970a0-c801-4472-a3b6-eccd8335d0a8-kube-api-access-zqc2q\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120233 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120626 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120665 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-config\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120689 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqc2q\" (UniqueName: \"kubernetes.io/projected/f6d970a0-c801-4472-a3b6-eccd8335d0a8-kube-api-access-zqc2q\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-svc\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.120980 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-openstack-edpm-ipam\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.121543 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-nb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.122046 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-svc\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.122051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-ovsdbserver-sb\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g"
Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 
19:42:54.122944 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-config\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.123468 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6d970a0-c801-4472-a3b6-eccd8335d0a8-dns-swift-storage-0\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.156982 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqc2q\" (UniqueName: \"kubernetes.io/projected/f6d970a0-c801-4472-a3b6-eccd8335d0a8-kube-api-access-zqc2q\") pod \"dnsmasq-dns-85f64749dc-8zj5g\" (UID: \"f6d970a0-c801-4472-a3b6-eccd8335d0a8\") " pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.253579 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.389693 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.527883 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.527964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.528006 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lfrj\" (UniqueName: \"kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.528051 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.528101 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.528265 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0\") pod \"dfcca6fc-5afb-464c-9852-3532ba5878a3\" (UID: \"dfcca6fc-5afb-464c-9852-3532ba5878a3\") " Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.533682 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj" (OuterVolumeSpecName: "kube-api-access-2lfrj") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "kube-api-access-2lfrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.582629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config" (OuterVolumeSpecName: "config") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.583579 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.589980 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.591121 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.592774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dfcca6fc-5afb-464c-9852-3532ba5878a3" (UID: "dfcca6fc-5afb-464c-9852-3532ba5878a3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630822 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630856 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630866 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630874 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lfrj\" (UniqueName: \"kubernetes.io/projected/dfcca6fc-5afb-464c-9852-3532ba5878a3-kube-api-access-2lfrj\") on node \"crc\" 
DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630884 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.630891 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dfcca6fc-5afb-464c-9852-3532ba5878a3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.739715 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85f64749dc-8zj5g"] Feb 19 19:42:54 crc kubenswrapper[4722]: W0219 19:42:54.745109 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6d970a0_c801_4472_a3b6_eccd8335d0a8.slice/crio-6dba3c76ac69f9df6203910a0e2120ab2b67fa91f5982cbe596f045c1c055a6a WatchSource:0}: Error finding container 6dba3c76ac69f9df6203910a0e2120ab2b67fa91f5982cbe596f045c1c055a6a: Status 404 returned error can't find the container with id 6dba3c76ac69f9df6203910a0e2120ab2b67fa91f5982cbe596f045c1c055a6a Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.978055 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.978172 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-f59d8" event={"ID":"dfcca6fc-5afb-464c-9852-3532ba5878a3","Type":"ContainerDied","Data":"a7846c5aa72760b5fbf2419a5198a4a23f44068dc0e3a98cd281007d3f37f7b4"} Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.978616 4722 scope.go:117] "RemoveContainer" containerID="07ae856c61611ad79b54a655cdc3c7aa79d812aa79705666cb7de6834474fefb" Feb 19 19:42:54 crc kubenswrapper[4722]: I0219 19:42:54.979844 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" event={"ID":"f6d970a0-c801-4472-a3b6-eccd8335d0a8","Type":"ContainerStarted","Data":"6dba3c76ac69f9df6203910a0e2120ab2b67fa91f5982cbe596f045c1c055a6a"} Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.005710 4722 scope.go:117] "RemoveContainer" containerID="2645caf8bc3502647b4c5a4dc4d97510df5ceb77697881dbc41661d5cae80579" Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.011864 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"] Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.021396 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-f59d8"] Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.086940 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" path="/var/lib/kubelet/pods/dfcca6fc-5afb-464c-9852-3532ba5878a3/volumes" Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.993116 4722 generic.go:334] "Generic (PLEG): container finished" podID="f6d970a0-c801-4472-a3b6-eccd8335d0a8" containerID="62081909198e335b4f853110f4fe5edc71f1a94287877a0e84078384c778ac69" exitCode=0 Feb 19 19:42:55 crc kubenswrapper[4722]: I0219 19:42:55.993535 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" event={"ID":"f6d970a0-c801-4472-a3b6-eccd8335d0a8","Type":"ContainerDied","Data":"62081909198e335b4f853110f4fe5edc71f1a94287877a0e84078384c778ac69"} Feb 19 19:42:57 crc kubenswrapper[4722]: I0219 19:42:57.010841 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" event={"ID":"f6d970a0-c801-4472-a3b6-eccd8335d0a8","Type":"ContainerStarted","Data":"a0203ad2b95afa13c3142f7de2f923065378b896fe80d57ec8368e46a4dd1048"} Feb 19 19:42:57 crc kubenswrapper[4722]: I0219 19:42:57.011217 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:42:57 crc kubenswrapper[4722]: I0219 19:42:57.056023 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" podStartSLOduration=4.056001721 podStartE2EDuration="4.056001721s" podCreationTimestamp="2026-02-19 19:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:42:57.042080226 +0000 UTC m=+1476.654430580" watchObservedRunningTime="2026-02-19 19:42:57.056001721 +0000 UTC m=+1476.668352055" Feb 19 19:43:00 crc kubenswrapper[4722]: I0219 19:43:00.887375 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.256078 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85f64749dc-8zj5g" Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.323551 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.323802 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" 
podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="dnsmasq-dns" containerID="cri-o://a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4" gracePeriod=10 Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.886206 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.984770 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.984892 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.985005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.985120 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.985189 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65zd2\" (UniqueName: 
\"kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.985233 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.985277 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam\") pod \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\" (UID: \"7a987597-e2e2-431d-9583-01f4dc2f4ecf\") " Feb 19 19:43:04 crc kubenswrapper[4722]: I0219 19:43:04.993500 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2" (OuterVolumeSpecName: "kube-api-access-65zd2") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "kube-api-access-65zd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.041542 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.041555 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.054471 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config" (OuterVolumeSpecName: "config") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.054799 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.055956 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.062010 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a987597-e2e2-431d-9583-01f4dc2f4ecf" (UID: "7a987597-e2e2-431d-9583-01f4dc2f4ecf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087176 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087201 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087211 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65zd2\" (UniqueName: \"kubernetes.io/projected/7a987597-e2e2-431d-9583-01f4dc2f4ecf-kube-api-access-65zd2\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087219 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087227 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087236 4722 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-config\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.087244 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a987597-e2e2-431d-9583-01f4dc2f4ecf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.110218 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerID="a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4" exitCode=0 Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.110259 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" event={"ID":"7a987597-e2e2-431d-9583-01f4dc2f4ecf","Type":"ContainerDied","Data":"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4"} Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.110291 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" event={"ID":"7a987597-e2e2-431d-9583-01f4dc2f4ecf","Type":"ContainerDied","Data":"4fa0562983df72acffc986b649c09943854e4927d1c687f922152874e0ab49dd"} Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.110316 4722 scope.go:117] "RemoveContainer" containerID="a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.110472 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dbb88bf8c-zzl74" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.151085 4722 scope.go:117] "RemoveContainer" containerID="82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.153624 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.169641 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dbb88bf8c-zzl74"] Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.183319 4722 scope.go:117] "RemoveContainer" containerID="a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4" Feb 19 19:43:05 crc kubenswrapper[4722]: E0219 19:43:05.183727 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4\": container with ID starting with a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4 not found: ID does not exist" containerID="a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.183787 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4"} err="failed to get container status \"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4\": rpc error: code = NotFound desc = could not find container \"a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4\": container with ID starting with a7885a552a724dfe58d47db8f95893f29969ec81481c63ebf2514f7452625fb4 not found: ID does not exist" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.183812 4722 scope.go:117] "RemoveContainer" containerID="82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8" Feb 19 
19:43:05 crc kubenswrapper[4722]: E0219 19:43:05.184308 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8\": container with ID starting with 82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8 not found: ID does not exist" containerID="82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8" Feb 19 19:43:05 crc kubenswrapper[4722]: I0219 19:43:05.184366 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8"} err="failed to get container status \"82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8\": rpc error: code = NotFound desc = could not find container \"82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8\": container with ID starting with 82740bb33b8ebf0f229b546299b6020b79c832469ce6108dd60a8ad2b92365a8 not found: ID does not exist" Feb 19 19:43:07 crc kubenswrapper[4722]: I0219 19:43:07.086041 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" path="/var/lib/kubelet/pods/7a987597-e2e2-431d-9583-01f4dc2f4ecf/volumes" Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.273515 4722 generic.go:334] "Generic (PLEG): container finished" podID="9f14785b-2e99-4110-9523-78ec32490e71" containerID="62d771eb1e8a20f3816db1e78f60944ccbfed3ae437139348353bf9a91656d8f" exitCode=0 Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.273618 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f14785b-2e99-4110-9523-78ec32490e71","Type":"ContainerDied","Data":"62d771eb1e8a20f3816db1e78f60944ccbfed3ae437139348353bf9a91656d8f"} Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.277707 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="9ac0e00c-0e1d-40fa-802d-8a77ac4c842b" containerID="b95673087d8ea4b9a6a852d0c1a317e33ef78571ef0754777ff9655eec8f3615" exitCode=0
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.277772    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b","Type":"ContainerDied","Data":"b95673087d8ea4b9a6a852d0c1a317e33ef78571ef0754777ff9655eec8f3615"}
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.537739    4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"]
Feb 19 19:43:17 crc kubenswrapper[4722]: E0219 19:43:17.538614    4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="init"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.538634    4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="init"
Feb 19 19:43:17 crc kubenswrapper[4722]: E0219 19:43:17.538650    4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="dnsmasq-dns"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.538657    4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="dnsmasq-dns"
Feb 19 19:43:17 crc kubenswrapper[4722]: E0219 19:43:17.538695    4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="dnsmasq-dns"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.538703    4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="dnsmasq-dns"
Feb 19 19:43:17 crc kubenswrapper[4722]: E0219 19:43:17.538731    4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="init"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.538739    4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="init"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.538993    4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfcca6fc-5afb-464c-9852-3532ba5878a3" containerName="dnsmasq-dns"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.539011    4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a987597-e2e2-431d-9583-01f4dc2f4ecf" containerName="dnsmasq-dns"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.546950    4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.551602    4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.551807    4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.552010    4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.555710    4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"]
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.556450    4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.611904    4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.612102    4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.612171    4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.612228    4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jbwd\" (UniqueName: \"kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.713868    4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.714447    4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jbwd\" (UniqueName: \"kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.714621    4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.714683    4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.728628    4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.729133    4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.730886    4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.741591    4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jbwd\" (UniqueName: \"kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:17 crc kubenswrapper[4722]: I0219 19:43:17.897888    4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.325786    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9f14785b-2e99-4110-9523-78ec32490e71","Type":"ContainerStarted","Data":"b8e281ed8c18d780a9dc36ca1a7967bf4234e515374da1f7e0eb97a781e463de"}
Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.327470    4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.335896    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9ac0e00c-0e1d-40fa-802d-8a77ac4c842b","Type":"ContainerStarted","Data":"18603bed4e5f7e71e772012ca13ea8f29124d9e40c05dc8079c98faf9d74aa51"}
Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.336123    4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.365880    4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.365856455 podStartE2EDuration="37.365856455s" podCreationTimestamp="2026-02-19 19:42:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:43:18.348669828 +0000 UTC m=+1497.961020152" watchObservedRunningTime="2026-02-19 19:43:18.365856455 +0000 UTC m=+1497.978206779"
Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.374920    4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.374905277 podStartE2EDuration="37.374905277s" podCreationTimestamp="2026-02-19 19:42:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:43:18.368351573 +0000 UTC m=+1497.980701917" watchObservedRunningTime="2026-02-19 19:43:18.374905277 +0000 UTC m=+1497.987255601"
Feb 19 19:43:18 crc kubenswrapper[4722]: I0219 19:43:18.461747    4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"]
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.199274    4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lfr4g"]
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.201453    4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.207970    4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfr4g"]
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.348299    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" event={"ID":"78d0d06a-2199-4c5c-99e9-5bf916d8f30e","Type":"ContainerStarted","Data":"80ec107254c616b2d4c87f564f37922dd485c1c585d1ed01ad5292b221ec5dfb"}
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.349648    4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.349733    4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.350505    4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbbt2\" (UniqueName: \"kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.451773    4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbbt2\" (UniqueName: \"kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.451954    4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.451999    4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.452297    4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.452478    4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.474348    4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbbt2\" (UniqueName: \"kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2\") pod \"community-operators-lfr4g\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") " pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:19 crc kubenswrapper[4722]: I0219 19:43:19.566846    4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:20 crc kubenswrapper[4722]: I0219 19:43:20.060772    4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfr4g"]
Feb 19 19:43:20 crc kubenswrapper[4722]: I0219 19:43:20.366876    4722 generic.go:334] "Generic (PLEG): container finished" podID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerID="9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316" exitCode=0
Feb 19 19:43:20 crc kubenswrapper[4722]: I0219 19:43:20.367020    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerDied","Data":"9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316"}
Feb 19 19:43:20 crc kubenswrapper[4722]: I0219 19:43:20.367097    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerStarted","Data":"1e1f79d00b654bd29d0af3c012deee5c83d982006853865e13db335a0839341c"}
Feb 19 19:43:22 crc kubenswrapper[4722]: I0219 19:43:22.719291    4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0"
Feb 19 19:43:23 crc kubenswrapper[4722]: I0219 19:43:23.408979    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerStarted","Data":"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb"}
Feb 19 19:43:25 crc kubenswrapper[4722]: I0219 19:43:25.433390    4722 generic.go:334] "Generic (PLEG): container finished" podID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerID="74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb" exitCode=0
Feb 19 19:43:25 crc kubenswrapper[4722]: I0219 19:43:25.433652    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerDied","Data":"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb"}
Feb 19 19:43:28 crc kubenswrapper[4722]: I0219 19:43:28.616192    4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.477071    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" event={"ID":"78d0d06a-2199-4c5c-99e9-5bf916d8f30e","Type":"ContainerStarted","Data":"0df2affc7967aa9fbc1883fd8ed8d42f351642cd08abaac174535f8af0673d64"}
Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.479738    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerStarted","Data":"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f"}
Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.498106    4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" podStartSLOduration=2.349120012 podStartE2EDuration="12.498087112s" podCreationTimestamp="2026-02-19 19:43:17 +0000 UTC" firstStartedPulling="2026-02-19 19:43:18.464567408 +0000 UTC m=+1498.076917742" lastFinishedPulling="2026-02-19 19:43:28.613534518 +0000 UTC m=+1508.225884842" observedRunningTime="2026-02-19 19:43:29.490387972 +0000 UTC m=+1509.102738296" watchObservedRunningTime="2026-02-19 19:43:29.498087112 +0000 UTC m=+1509.110437436"
Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.523017    4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lfr4g" podStartSLOduration=3.335626735 podStartE2EDuration="10.52299515s" podCreationTimestamp="2026-02-19 19:43:19 +0000 UTC" firstStartedPulling="2026-02-19 19:43:21.704830479 +0000 UTC m=+1501.317180803" lastFinishedPulling="2026-02-19 19:43:28.892198894 +0000 UTC m=+1508.504549218" observedRunningTime="2026-02-19 19:43:29.522044841 +0000 UTC m=+1509.134395165" watchObservedRunningTime="2026-02-19 19:43:29.52299515 +0000 UTC m=+1509.135345474"
Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.568256    4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:29 crc kubenswrapper[4722]: I0219 19:43:29.568394    4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:30 crc kubenswrapper[4722]: I0219 19:43:30.616359    4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-lfr4g" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="registry-server" probeResult="failure" output=<
Feb 19 19:43:30 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 19 19:43:30 crc kubenswrapper[4722]: >
Feb 19 19:43:32 crc kubenswrapper[4722]: I0219 19:43:32.098377    4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 19 19:43:32 crc kubenswrapper[4722]: I0219 19:43:32.518302    4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 19:43:35 crc kubenswrapper[4722]: I0219 19:43:35.132574    4722 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7a987597-e2e2-431d-9583-01f4dc2f4ecf"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7a987597-e2e2-431d-9583-01f4dc2f4ecf] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7a987597_e2e2_431d_9583_01f4dc2f4ecf.slice"
Feb 19 19:43:39 crc kubenswrapper[4722]: I0219 19:43:39.630128    4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:39 crc kubenswrapper[4722]: I0219 19:43:39.689598    4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:39 crc kubenswrapper[4722]: I0219 19:43:39.861972    4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfr4g"]
Feb 19 19:43:41 crc kubenswrapper[4722]: I0219 19:43:41.614715    4722 generic.go:334] "Generic (PLEG): container finished" podID="78d0d06a-2199-4c5c-99e9-5bf916d8f30e" containerID="0df2affc7967aa9fbc1883fd8ed8d42f351642cd08abaac174535f8af0673d64" exitCode=0
Feb 19 19:43:41 crc kubenswrapper[4722]: I0219 19:43:41.614798    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" event={"ID":"78d0d06a-2199-4c5c-99e9-5bf916d8f30e","Type":"ContainerDied","Data":"0df2affc7967aa9fbc1883fd8ed8d42f351642cd08abaac174535f8af0673d64"}
Feb 19 19:43:41 crc kubenswrapper[4722]: I0219 19:43:41.615183    4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lfr4g" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="registry-server" containerID="cri-o://9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f" gracePeriod=2
Feb 19 19:43:41 crc kubenswrapper[4722]: I0219 19:43:41.923510    4722 scope.go:117] "RemoveContainer" containerID="ec6e9a5d8db1ce9bec823742a602001b48238109f03304859ab2fe4f5a1aeb10"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.044690    4722 scope.go:117] "RemoveContainer" containerID="044ff08c5dbbd2f41c731beab45cb688557289abbb1920032c7fa0385f11e9f7"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.147335    4722 scope.go:117] "RemoveContainer" containerID="62cc34e349902eca38fc94fdcd77006a8905ea0cb9cbb3392c7d1c40da4629fc"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.328062    4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.414907    4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities\") pod \"03661e8e-c7dc-4b7a-b463-8ef04af17523\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") "
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.415138    4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content\") pod \"03661e8e-c7dc-4b7a-b463-8ef04af17523\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") "
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.415203    4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbbt2\" (UniqueName: \"kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2\") pod \"03661e8e-c7dc-4b7a-b463-8ef04af17523\" (UID: \"03661e8e-c7dc-4b7a-b463-8ef04af17523\") "
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.416027    4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities" (OuterVolumeSpecName: "utilities") pod "03661e8e-c7dc-4b7a-b463-8ef04af17523" (UID: "03661e8e-c7dc-4b7a-b463-8ef04af17523"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.421031    4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2" (OuterVolumeSpecName: "kube-api-access-wbbt2") pod "03661e8e-c7dc-4b7a-b463-8ef04af17523" (UID: "03661e8e-c7dc-4b7a-b463-8ef04af17523"). InnerVolumeSpecName "kube-api-access-wbbt2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.466402    4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03661e8e-c7dc-4b7a-b463-8ef04af17523" (UID: "03661e8e-c7dc-4b7a-b463-8ef04af17523"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.517262    4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.517290    4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbbt2\" (UniqueName: \"kubernetes.io/projected/03661e8e-c7dc-4b7a-b463-8ef04af17523-kube-api-access-wbbt2\") on node \"crc\" DevicePath \"\""
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.517301    4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03661e8e-c7dc-4b7a-b463-8ef04af17523-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.636395    4722 generic.go:334] "Generic (PLEG): container finished" podID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerID="9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f" exitCode=0
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.637636    4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfr4g"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.638010    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerDied","Data":"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f"}
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.638064    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfr4g" event={"ID":"03661e8e-c7dc-4b7a-b463-8ef04af17523","Type":"ContainerDied","Data":"1e1f79d00b654bd29d0af3c012deee5c83d982006853865e13db335a0839341c"}
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.638096    4722 scope.go:117] "RemoveContainer" containerID="9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.671338    4722 scope.go:117] "RemoveContainer" containerID="74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.687861    4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfr4g"]
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.700355    4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lfr4g"]
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.712337    4722 scope.go:117] "RemoveContainer" containerID="9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.742797    4722 scope.go:117] "RemoveContainer" containerID="9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f"
Feb 19 19:43:42 crc kubenswrapper[4722]: E0219 19:43:42.743744    4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f\": container with ID starting with 9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f not found: ID does not exist" containerID="9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.743786    4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f"} err="failed to get container status \"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f\": rpc error: code = NotFound desc = could not find container \"9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f\": container with ID starting with 9c6dac0cc6ad0518b4a0c924c85f5dc54550d5622866c130091bd36d91d75e1f not found: ID does not exist"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.743814    4722 scope.go:117] "RemoveContainer" containerID="74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb"
Feb 19 19:43:42 crc kubenswrapper[4722]: E0219 19:43:42.744231    4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb\": container with ID starting with 74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb not found: ID does not exist" containerID="74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.744272    4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb"} err="failed to get container status \"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb\": rpc error: code = NotFound desc = could not find container \"74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb\": container with ID starting with 74220fc8cf353c4508bbec3653a5fb74dea00493898e2bba0a6266da7badeaeb not found: ID does not exist"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.744305    4722 scope.go:117] "RemoveContainer" containerID="9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316"
Feb 19 19:43:42 crc kubenswrapper[4722]: E0219 19:43:42.744750    4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316\": container with ID starting with 9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316 not found: ID does not exist" containerID="9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316"
Feb 19 19:43:42 crc kubenswrapper[4722]: I0219 19:43:42.744783    4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316"} err="failed to get container status \"9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316\": rpc error: code = NotFound desc = could not find container \"9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316\": container with ID starting with 9a69081cca0548c6e2e5c1fcfac6d0fa5d9b99b9606b69448a9ab53bf13f2316 not found: ID does not exist"
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.083348    4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" path="/var/lib/kubelet/pods/03661e8e-c7dc-4b7a-b463-8ef04af17523/volumes"
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.149086    4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.255310    4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle\") pod \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") "
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.255598    4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jbwd\" (UniqueName: \"kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd\") pod \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") "
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.255697    4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory\") pod \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") "
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.255758    4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam\") pod \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\" (UID: \"78d0d06a-2199-4c5c-99e9-5bf916d8f30e\") "
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.260689    4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd" (OuterVolumeSpecName: "kube-api-access-6jbwd") pod "78d0d06a-2199-4c5c-99e9-5bf916d8f30e" (UID: "78d0d06a-2199-4c5c-99e9-5bf916d8f30e"). InnerVolumeSpecName "kube-api-access-6jbwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.262379    4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "78d0d06a-2199-4c5c-99e9-5bf916d8f30e" (UID: "78d0d06a-2199-4c5c-99e9-5bf916d8f30e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.291106    4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "78d0d06a-2199-4c5c-99e9-5bf916d8f30e" (UID: "78d0d06a-2199-4c5c-99e9-5bf916d8f30e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.305277    4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory" (OuterVolumeSpecName: "inventory") pod "78d0d06a-2199-4c5c-99e9-5bf916d8f30e" (UID: "78d0d06a-2199-4c5c-99e9-5bf916d8f30e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.358404    4722 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.358452    4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jbwd\" (UniqueName: \"kubernetes.io/projected/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-kube-api-access-6jbwd\") on node \"crc\" DevicePath \"\""
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.358468    4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.358480    4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d0d06a-2199-4c5c-99e9-5bf916d8f30e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.656695    4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx" event={"ID":"78d0d06a-2199-4c5c-99e9-5bf916d8f30e","Type":"ContainerDied","Data":"80ec107254c616b2d4c87f564f37922dd485c1c585d1ed01ad5292b221ec5dfb"}
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.656746    4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ec107254c616b2d4c87f564f37922dd485c1c585d1ed01ad5292b221ec5dfb"
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.656713    4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx"
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.719827    4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52"]
Feb 19 19:43:43 crc kubenswrapper[4722]: E0219 19:43:43.720221    4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="extract-content"
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720238    4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="extract-content"
Feb 19 19:43:43 crc kubenswrapper[4722]: E0219 19:43:43.720250    4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="extract-utilities"
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720258    4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="extract-utilities"
Feb 19 19:43:43 crc kubenswrapper[4722]: E0219 19:43:43.720278    4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="registry-server"
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720284    4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="registry-server"
Feb 19 19:43:43 crc kubenswrapper[4722]: E0219 19:43:43.720313    4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d0d06a-2199-4c5c-99e9-5bf916d8f30e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720321    4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d0d06a-2199-4c5c-99e9-5bf916d8f30e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720500    4722 
memory_manager.go:354] "RemoveStaleState removing state" podUID="78d0d06a-2199-4c5c-99e9-5bf916d8f30e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.720524 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="03661e8e-c7dc-4b7a-b463-8ef04af17523" containerName="registry-server" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.721264 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.731818 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52"] Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.733387 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.733678 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.733867 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.739102 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.872660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 
19:43:43.872927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clwl9\" (UniqueName: \"kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.873069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.975608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.975669 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clwl9\" (UniqueName: \"kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.975693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.979122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:43 crc kubenswrapper[4722]: I0219 19:43:43.979678 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:44 crc kubenswrapper[4722]: I0219 19:43:44.000376 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clwl9\" (UniqueName: \"kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-2hf52\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:44 crc kubenswrapper[4722]: I0219 19:43:44.096776 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:44 crc kubenswrapper[4722]: I0219 19:43:44.629609 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52"] Feb 19 19:43:44 crc kubenswrapper[4722]: W0219 19:43:44.638027 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2554051_f8a8_413e_b352_13ac8f88da63.slice/crio-3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de WatchSource:0}: Error finding container 3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de: Status 404 returned error can't find the container with id 3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de Feb 19 19:43:44 crc kubenswrapper[4722]: I0219 19:43:44.672733 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" event={"ID":"d2554051-f8a8-413e-b352-13ac8f88da63","Type":"ContainerStarted","Data":"3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de"} Feb 19 19:43:45 crc kubenswrapper[4722]: I0219 19:43:45.724971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" event={"ID":"d2554051-f8a8-413e-b352-13ac8f88da63","Type":"ContainerStarted","Data":"20998a5e9b0bbea9ed5fc65c67fcd47b1526c1aa9ab0f5cb2c3accf655d0120e"} Feb 19 19:43:45 crc kubenswrapper[4722]: I0219 19:43:45.753643 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" podStartSLOduration=2.358406764 podStartE2EDuration="2.753622511s" podCreationTimestamp="2026-02-19 19:43:43 +0000 UTC" firstStartedPulling="2026-02-19 19:43:44.641000141 +0000 UTC m=+1524.253350465" lastFinishedPulling="2026-02-19 19:43:45.036215878 +0000 UTC m=+1524.648566212" observedRunningTime="2026-02-19 
19:43:45.750195834 +0000 UTC m=+1525.362546188" watchObservedRunningTime="2026-02-19 19:43:45.753622511 +0000 UTC m=+1525.365972835" Feb 19 19:43:47 crc kubenswrapper[4722]: I0219 19:43:47.748741 4722 generic.go:334] "Generic (PLEG): container finished" podID="d2554051-f8a8-413e-b352-13ac8f88da63" containerID="20998a5e9b0bbea9ed5fc65c67fcd47b1526c1aa9ab0f5cb2c3accf655d0120e" exitCode=0 Feb 19 19:43:47 crc kubenswrapper[4722]: I0219 19:43:47.748855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" event={"ID":"d2554051-f8a8-413e-b352-13ac8f88da63","Type":"ContainerDied","Data":"20998a5e9b0bbea9ed5fc65c67fcd47b1526c1aa9ab0f5cb2c3accf655d0120e"} Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.282432 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.393019 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clwl9\" (UniqueName: \"kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9\") pod \"d2554051-f8a8-413e-b352-13ac8f88da63\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.393190 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory\") pod \"d2554051-f8a8-413e-b352-13ac8f88da63\" (UID: \"d2554051-f8a8-413e-b352-13ac8f88da63\") " Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.393242 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam\") pod \"d2554051-f8a8-413e-b352-13ac8f88da63\" (UID: 
\"d2554051-f8a8-413e-b352-13ac8f88da63\") " Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.399076 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9" (OuterVolumeSpecName: "kube-api-access-clwl9") pod "d2554051-f8a8-413e-b352-13ac8f88da63" (UID: "d2554051-f8a8-413e-b352-13ac8f88da63"). InnerVolumeSpecName "kube-api-access-clwl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.424624 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory" (OuterVolumeSpecName: "inventory") pod "d2554051-f8a8-413e-b352-13ac8f88da63" (UID: "d2554051-f8a8-413e-b352-13ac8f88da63"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.429560 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d2554051-f8a8-413e-b352-13ac8f88da63" (UID: "d2554051-f8a8-413e-b352-13ac8f88da63"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.495652 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clwl9\" (UniqueName: \"kubernetes.io/projected/d2554051-f8a8-413e-b352-13ac8f88da63-kube-api-access-clwl9\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.495692 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.495703 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d2554051-f8a8-413e-b352-13ac8f88da63-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.774809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" event={"ID":"d2554051-f8a8-413e-b352-13ac8f88da63","Type":"ContainerDied","Data":"3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de"} Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.775145 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b05bb912a720c291d17fc2bbbe50e4aac5ac97aed9f966b3b071024a551d0de" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.774916 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-2hf52" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.868622 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4"] Feb 19 19:43:49 crc kubenswrapper[4722]: E0219 19:43:49.869223 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2554051-f8a8-413e-b352-13ac8f88da63" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.869254 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2554051-f8a8-413e-b352-13ac8f88da63" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.869515 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2554051-f8a8-413e-b352-13ac8f88da63" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.870425 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.874714 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.875001 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.875080 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.875265 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:43:49 crc kubenswrapper[4722]: I0219 19:43:49.889537 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4"] Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.007275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.007464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.007493 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.007575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm4vd\" (UniqueName: \"kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.110375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.110579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.110797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.110895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm4vd\" (UniqueName: \"kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.115836 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.115996 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.116128 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.128945 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm4vd\" (UniqueName: \"kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.194920 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.753334 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4"] Feb 19 19:43:50 crc kubenswrapper[4722]: I0219 19:43:50.786877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" event={"ID":"7573aaf8-263a-4e50-84da-58cf311829a9","Type":"ContainerStarted","Data":"6f9d6eb7165cf38cb4798873ca7e4eb22283d0374e3333f832035a7b8aca2450"} Feb 19 19:43:51 crc kubenswrapper[4722]: I0219 19:43:51.798594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" event={"ID":"7573aaf8-263a-4e50-84da-58cf311829a9","Type":"ContainerStarted","Data":"ed27796d3d25748986df212c115b101ecfee62c9f9764796d2d2ee4e35289aef"} Feb 19 19:43:51 crc kubenswrapper[4722]: I0219 19:43:51.818789 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" podStartSLOduration=2.3963085570000002 podStartE2EDuration="2.818769465s" podCreationTimestamp="2026-02-19 19:43:49 +0000 UTC" firstStartedPulling="2026-02-19 19:43:50.748441197 +0000 UTC m=+1530.360791521" 
lastFinishedPulling="2026-02-19 19:43:51.170902105 +0000 UTC m=+1530.783252429" observedRunningTime="2026-02-19 19:43:51.818233678 +0000 UTC m=+1531.430584002" watchObservedRunningTime="2026-02-19 19:43:51.818769465 +0000 UTC m=+1531.431119789" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.519620 4722 scope.go:117] "RemoveContainer" containerID="0cb57e5ce54d4ebdcfc5834077ee30754bec175aed42d1c77310f409f5adb33c" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.581509 4722 scope.go:117] "RemoveContainer" containerID="39d3bd74fcad2b2ba6a5d3be195f9ef849a5a1caabbd2723eb1f1b100ba3c28c" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.617545 4722 scope.go:117] "RemoveContainer" containerID="3ce9bc56dc0250472fbd7d818bb628d5fdf7798657a6fd7b1570bd5c3b64c1ae" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.697602 4722 scope.go:117] "RemoveContainer" containerID="6956d55506ad813de368c67533400189dca7fad85038770d3e67703d4229d5da" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.732265 4722 scope.go:117] "RemoveContainer" containerID="0f3ddcaf8c81704eaf6b201c98a6bdf76e2b380c4dac2d9db9d77cb9f737e62a" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.785690 4722 scope.go:117] "RemoveContainer" containerID="8fb5c1c0ec360aa5fc271ce7683847ce4ebe5cbb2a0793d19d34b7cc7bc220b8" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.826092 4722 scope.go:117] "RemoveContainer" containerID="8abd067186838cbd1efbd6d007696dcd996ec432757392f167f24e47f4f57171" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.862357 4722 scope.go:117] "RemoveContainer" containerID="26f23b94ceca02366d6ad7b5b51d95589832118420b7f024d6cc30a861e72a4d" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.881603 4722 scope.go:117] "RemoveContainer" containerID="df36524cd2a523caf0ae3f85ddef265e7c54e5ba8fa2da85c3fd083ca4ebd887" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.904971 4722 scope.go:117] "RemoveContainer" 
containerID="4dec94c6774384698a0cf861b554d74fb1ddd8514338b3e11d17056ce861d124" Feb 19 19:44:42 crc kubenswrapper[4722]: I0219 19:44:42.925942 4722 scope.go:117] "RemoveContainer" containerID="17e885ee19d45823afa31ec6273541ee2f4327ad3250b341ab5883d6c0baed3b" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.159215 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm"] Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.161396 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.164084 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.164625 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.179690 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm"] Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.328249 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.328485 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume\") pod 
\"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.328575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6kz8\" (UniqueName: \"kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.430280 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.430404 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6kz8\" (UniqueName: \"kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.430491 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.431309 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.438760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.450258 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6kz8\" (UniqueName: \"kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8\") pod \"collect-profiles-29525505-j4pzm\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.494984 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:00 crc kubenswrapper[4722]: I0219 19:45:00.990333 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm"] Feb 19 19:45:01 crc kubenswrapper[4722]: I0219 19:45:01.577741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" event={"ID":"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6","Type":"ContainerStarted","Data":"75756214ecf739e6539c33ec90e775742f12ea9cf526026780602fab4300d835"} Feb 19 19:45:01 crc kubenswrapper[4722]: I0219 19:45:01.578051 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" event={"ID":"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6","Type":"ContainerStarted","Data":"2282f5e8c1fe4ccd890b8006981551ba2548f12c130fc4a075dd47c446dd0b2b"} Feb 19 19:45:01 crc kubenswrapper[4722]: I0219 19:45:01.599405 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" podStartSLOduration=1.599291807 podStartE2EDuration="1.599291807s" podCreationTimestamp="2026-02-19 19:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 19:45:01.594010602 +0000 UTC m=+1601.206360936" watchObservedRunningTime="2026-02-19 19:45:01.599291807 +0000 UTC m=+1601.211642131" Feb 19 19:45:02 crc kubenswrapper[4722]: I0219 19:45:02.588007 4722 generic.go:334] "Generic (PLEG): container finished" podID="de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" containerID="75756214ecf739e6539c33ec90e775742f12ea9cf526026780602fab4300d835" exitCode=0 Feb 19 19:45:02 crc kubenswrapper[4722]: I0219 19:45:02.588406 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" event={"ID":"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6","Type":"ContainerDied","Data":"75756214ecf739e6539c33ec90e775742f12ea9cf526026780602fab4300d835"} Feb 19 19:45:03 crc kubenswrapper[4722]: I0219 19:45:03.995800 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.105269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6kz8\" (UniqueName: \"kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8\") pod \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.105577 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume\") pod \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.105824 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume\") pod \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\" (UID: \"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6\") " Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.106349 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume" (OuterVolumeSpecName: "config-volume") pod "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" (UID: "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.106868 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.111065 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" (UID: "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.111371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8" (OuterVolumeSpecName: "kube-api-access-d6kz8") pod "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" (UID: "de7a45c0-a648-4a25-95f2-c0fa1dd70cf6"). InnerVolumeSpecName "kube-api-access-d6kz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.209067 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6kz8\" (UniqueName: \"kubernetes.io/projected/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-kube-api-access-d6kz8\") on node \"crc\" DevicePath \"\"" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.209109 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/de7a45c0-a648-4a25-95f2-c0fa1dd70cf6-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.613907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" event={"ID":"de7a45c0-a648-4a25-95f2-c0fa1dd70cf6","Type":"ContainerDied","Data":"2282f5e8c1fe4ccd890b8006981551ba2548f12c130fc4a075dd47c446dd0b2b"} Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.613955 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2282f5e8c1fe4ccd890b8006981551ba2548f12c130fc4a075dd47c446dd0b2b" Feb 19 19:45:04 crc kubenswrapper[4722]: I0219 19:45:04.614019 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525505-j4pzm" Feb 19 19:45:11 crc kubenswrapper[4722]: I0219 19:45:11.798992 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:45:11 crc kubenswrapper[4722]: I0219 19:45:11.799643 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:45:41 crc kubenswrapper[4722]: I0219 19:45:41.798257 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:45:41 crc kubenswrapper[4722]: I0219 19:45:41.798662 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:45:43 crc kubenswrapper[4722]: I0219 19:45:43.271605 4722 scope.go:117] "RemoveContainer" containerID="2b66f446c38c7a939e26a897ca89dd08de59cc960c89adaf35fbd0e82bf8f636" Feb 19 19:45:43 crc kubenswrapper[4722]: I0219 19:45:43.296002 4722 scope.go:117] "RemoveContainer" containerID="0546c702603104f43bbaaf99f3fe718c40fad148666fb0d4d8b70707d6802f06" Feb 19 19:45:43 crc kubenswrapper[4722]: I0219 
19:45:43.315795 4722 scope.go:117] "RemoveContainer" containerID="13413006ae1624571bd31498af1bfba16b06dc1ae973f9ef0d89f06ecc4ef187" Feb 19 19:45:43 crc kubenswrapper[4722]: I0219 19:45:43.342591 4722 scope.go:117] "RemoveContainer" containerID="8fba7a7dd2b4b36b32712f1263954190cba9206e6fe4eb845c3663a36d4748db" Feb 19 19:45:43 crc kubenswrapper[4722]: I0219 19:45:43.392319 4722 scope.go:117] "RemoveContainer" containerID="14558b2b43b12bd6f938bfe33b938c7705b1528f8c8be67e451dfa9069d61fa8" Feb 19 19:46:11 crc kubenswrapper[4722]: I0219 19:46:11.798593 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:46:11 crc kubenswrapper[4722]: I0219 19:46:11.799111 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:46:11 crc kubenswrapper[4722]: I0219 19:46:11.799172 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:46:11 crc kubenswrapper[4722]: I0219 19:46:11.799917 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:46:11 crc kubenswrapper[4722]: I0219 19:46:11.799964 4722 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" gracePeriod=600 Feb 19 19:46:11 crc kubenswrapper[4722]: E0219 19:46:11.987544 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:46:12 crc kubenswrapper[4722]: I0219 19:46:12.524524 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" exitCode=0 Feb 19 19:46:12 crc kubenswrapper[4722]: I0219 19:46:12.524571 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"} Feb 19 19:46:12 crc kubenswrapper[4722]: I0219 19:46:12.524619 4722 scope.go:117] "RemoveContainer" containerID="5d87fcbd7a996e41ecc379a7fc5d8fec55b99f8916d82ec5d3e1bb7181cace17" Feb 19 19:46:12 crc kubenswrapper[4722]: I0219 19:46:12.525660 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:46:12 crc kubenswrapper[4722]: E0219 19:46:12.526632 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:46:26 crc kubenswrapper[4722]: I0219 19:46:26.071720 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:46:26 crc kubenswrapper[4722]: E0219 19:46:26.072489 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:46:37 crc kubenswrapper[4722]: I0219 19:46:37.072416 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:46:37 crc kubenswrapper[4722]: E0219 19:46:37.073650 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:46:43 crc kubenswrapper[4722]: I0219 19:46:43.510558 4722 scope.go:117] "RemoveContainer" containerID="a1a51025c6ac3a0493c572c91d9a6b9ce6a00a5a4e017dc5fcf2b5b985ce7e56" Feb 19 19:46:43 crc kubenswrapper[4722]: I0219 19:46:43.538971 4722 scope.go:117] "RemoveContainer" containerID="1ed5b5b084253c379ef4f64ca1d2a98bf7db526329e58a539659a2694681f3a0" Feb 19 19:46:47 crc kubenswrapper[4722]: I0219 19:46:47.909897 4722 
generic.go:334] "Generic (PLEG): container finished" podID="7573aaf8-263a-4e50-84da-58cf311829a9" containerID="ed27796d3d25748986df212c115b101ecfee62c9f9764796d2d2ee4e35289aef" exitCode=0 Feb 19 19:46:47 crc kubenswrapper[4722]: I0219 19:46:47.909976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" event={"ID":"7573aaf8-263a-4e50-84da-58cf311829a9","Type":"ContainerDied","Data":"ed27796d3d25748986df212c115b101ecfee62c9f9764796d2d2ee4e35289aef"} Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.421303 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.570635 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory\") pod \"7573aaf8-263a-4e50-84da-58cf311829a9\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.571137 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam\") pod \"7573aaf8-263a-4e50-84da-58cf311829a9\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.571207 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle\") pod \"7573aaf8-263a-4e50-84da-58cf311829a9\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.571964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mm4vd\" (UniqueName: \"kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd\") pod \"7573aaf8-263a-4e50-84da-58cf311829a9\" (UID: \"7573aaf8-263a-4e50-84da-58cf311829a9\") " Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.576050 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7573aaf8-263a-4e50-84da-58cf311829a9" (UID: "7573aaf8-263a-4e50-84da-58cf311829a9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.580942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd" (OuterVolumeSpecName: "kube-api-access-mm4vd") pod "7573aaf8-263a-4e50-84da-58cf311829a9" (UID: "7573aaf8-263a-4e50-84da-58cf311829a9"). InnerVolumeSpecName "kube-api-access-mm4vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.597384 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory" (OuterVolumeSpecName: "inventory") pod "7573aaf8-263a-4e50-84da-58cf311829a9" (UID: "7573aaf8-263a-4e50-84da-58cf311829a9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.607230 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7573aaf8-263a-4e50-84da-58cf311829a9" (UID: "7573aaf8-263a-4e50-84da-58cf311829a9"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.675285 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.675320 4722 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.675331 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm4vd\" (UniqueName: \"kubernetes.io/projected/7573aaf8-263a-4e50-84da-58cf311829a9-kube-api-access-mm4vd\") on node \"crc\" DevicePath \"\"" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.675341 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7573aaf8-263a-4e50-84da-58cf311829a9-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.926659 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" event={"ID":"7573aaf8-263a-4e50-84da-58cf311829a9","Type":"ContainerDied","Data":"6f9d6eb7165cf38cb4798873ca7e4eb22283d0374e3333f832035a7b8aca2450"} Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.926698 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f9d6eb7165cf38cb4798873ca7e4eb22283d0374e3333f832035a7b8aca2450" Feb 19 19:46:49 crc kubenswrapper[4722]: I0219 19:46:49.926759 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.032642 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8"] Feb 19 19:46:50 crc kubenswrapper[4722]: E0219 19:46:50.033226 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" containerName="collect-profiles" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.033255 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" containerName="collect-profiles" Feb 19 19:46:50 crc kubenswrapper[4722]: E0219 19:46:50.033300 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7573aaf8-263a-4e50-84da-58cf311829a9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.033310 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7573aaf8-263a-4e50-84da-58cf311829a9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.033549 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7573aaf8-263a-4e50-84da-58cf311829a9" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.033579 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7a45c0-a648-4a25-95f2-c0fa1dd70cf6" containerName="collect-profiles" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.034476 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.039667 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.039885 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.040035 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.040194 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.048203 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8"] Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.191431 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.191623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 
19:46:50.191687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w26gw\" (UniqueName: \"kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.293909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.293984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w26gw\" (UniqueName: \"kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.294035 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.300357 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.301069 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.316707 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w26gw\" (UniqueName: \"kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.353472 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.909900 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8"] Feb 19 19:46:50 crc kubenswrapper[4722]: W0219 19:46:50.912827 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a67d89_596c_44f0_b19d_dc5d1eb3021e.slice/crio-648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda WatchSource:0}: Error finding container 648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda: Status 404 returned error can't find the container with id 648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.916477 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:46:50 crc kubenswrapper[4722]: I0219 19:46:50.937333 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" event={"ID":"23a67d89-596c-44f0-b19d-dc5d1eb3021e","Type":"ContainerStarted","Data":"648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda"} Feb 19 19:46:51 crc kubenswrapper[4722]: I0219 19:46:51.082991 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:46:51 crc kubenswrapper[4722]: E0219 19:46:51.083698 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:46:51 crc kubenswrapper[4722]: I0219 19:46:51.952368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" event={"ID":"23a67d89-596c-44f0-b19d-dc5d1eb3021e","Type":"ContainerStarted","Data":"5a4c6d10bcfa0da53cc4b9e38924013e1f28f0ece8007cef9ebd1b78c76f2e64"} Feb 19 19:46:51 crc kubenswrapper[4722]: I0219 19:46:51.980980 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" podStartSLOduration=1.585481175 podStartE2EDuration="1.980949241s" podCreationTimestamp="2026-02-19 19:46:50 +0000 UTC" firstStartedPulling="2026-02-19 19:46:50.916184916 +0000 UTC m=+1710.528535260" lastFinishedPulling="2026-02-19 19:46:51.311652992 +0000 UTC m=+1710.924003326" observedRunningTime="2026-02-19 19:46:51.976626276 +0000 UTC m=+1711.588976640" watchObservedRunningTime="2026-02-19 19:46:51.980949241 +0000 UTC m=+1711.593299605" Feb 19 19:47:02 crc kubenswrapper[4722]: I0219 19:47:02.071590 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:47:02 crc kubenswrapper[4722]: E0219 19:47:02.072687 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:47:15 crc kubenswrapper[4722]: I0219 19:47:15.071633 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:47:15 crc kubenswrapper[4722]: E0219 19:47:15.072429 4722 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:47:28 crc kubenswrapper[4722]: I0219 19:47:28.067697 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-5m87g"] Feb 19 19:47:28 crc kubenswrapper[4722]: I0219 19:47:28.085084 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-5m87g"] Feb 19 19:47:29 crc kubenswrapper[4722]: I0219 19:47:29.071607 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:47:29 crc kubenswrapper[4722]: E0219 19:47:29.071946 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:47:29 crc kubenswrapper[4722]: I0219 19:47:29.083983 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f940a76-c93f-46c5-af29-5b098a54adc8" path="/var/lib/kubelet/pods/1f940a76-c93f-46c5-af29-5b098a54adc8/volumes" Feb 19 19:47:30 crc kubenswrapper[4722]: I0219 19:47:30.037556 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1f02-account-create-update-cslgg"] Feb 19 19:47:30 crc kubenswrapper[4722]: I0219 19:47:30.046945 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1f02-account-create-update-cslgg"] Feb 19 19:47:31 crc 
kubenswrapper[4722]: I0219 19:47:31.083530 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03387e77-59d8-4377-9a1c-dac948d84b59" path="/var/lib/kubelet/pods/03387e77-59d8-4377-9a1c-dac948d84b59/volumes" Feb 19 19:47:36 crc kubenswrapper[4722]: I0219 19:47:36.046287 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4b7g9"] Feb 19 19:47:36 crc kubenswrapper[4722]: I0219 19:47:36.062454 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4b7g9"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.034848 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c526-account-create-update-lmx4k"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.047345 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c526-account-create-update-lmx4k"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.058737 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-lqnqr"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.069640 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2439-account-create-update-lqmn5"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.085561 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd3ad13-0324-4c1c-9b74-eb1401f06507" path="/var/lib/kubelet/pods/5bd3ad13-0324-4c1c-9b74-eb1401f06507/volumes" Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.086543 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93536b6f-8176-4737-a547-9face2995981" path="/var/lib/kubelet/pods/93536b6f-8176-4737-a547-9face2995981/volumes" Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.087232 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2439-account-create-update-lqmn5"] Feb 19 19:47:37 crc kubenswrapper[4722]: I0219 19:47:37.090538 4722 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-lqnqr"] Feb 19 19:47:39 crc kubenswrapper[4722]: I0219 19:47:39.084536 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="248de930-2ecc-4ca2-9b2c-e9b8ccbc6358" path="/var/lib/kubelet/pods/248de930-2ecc-4ca2-9b2c-e9b8ccbc6358/volumes" Feb 19 19:47:39 crc kubenswrapper[4722]: I0219 19:47:39.085134 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44afb335-8449-4492-a772-78889877810e" path="/var/lib/kubelet/pods/44afb335-8449-4492-a772-78889877810e/volumes" Feb 19 19:47:40 crc kubenswrapper[4722]: I0219 19:47:40.031980 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2wnjc"] Feb 19 19:47:40 crc kubenswrapper[4722]: I0219 19:47:40.045174 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2wnjc"] Feb 19 19:47:41 crc kubenswrapper[4722]: I0219 19:47:41.083996 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8" path="/var/lib/kubelet/pods/6fc542bf-bcc1-48b0-b0d9-a1c4e2702cc8/volumes" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.072231 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:47:43 crc kubenswrapper[4722]: E0219 19:47:43.072768 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.591274 4722 scope.go:117] "RemoveContainer" 
containerID="99c98b71002ac8948511844b6989a0da14ae66e034112843908355f3a72c44e7" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.624819 4722 scope.go:117] "RemoveContainer" containerID="8aa3bea30fad3f939a077228a9ed1250c050038afc03ce315c796a876ab91692" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.690962 4722 scope.go:117] "RemoveContainer" containerID="65724bcd3ed9cb9dac1ea77b176d69bbb52e388afbda6a5fe57b607a6390a7e4" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.746695 4722 scope.go:117] "RemoveContainer" containerID="d614fd1da3e70b89a53ee5e8d38b91ca481cc6e55ebe3919a12aefd8b96f7538" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.809243 4722 scope.go:117] "RemoveContainer" containerID="2e209875892b5272f7bb00341b24fa8e6b2be48cf1bccfa8acb4859e6aeca425" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.839484 4722 scope.go:117] "RemoveContainer" containerID="e1203e3353e1b22d14cf15e5511afb0b51de1a779175f10f5d565c0c112db8ec" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.883992 4722 scope.go:117] "RemoveContainer" containerID="346ae374bf887f315658e5888cdaaef27ec7de0b0320851ac3b6d0f93d5058e0" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.902548 4722 scope.go:117] "RemoveContainer" containerID="c2f010a6f9fb7a90aca42363ebf34cb5a6a44700de8e1351f8ac807b74981bd2" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.930065 4722 scope.go:117] "RemoveContainer" containerID="33c264ae3ae6e4eeb0fddc45a932c5dedfb68e7bb87b529cf2bce1cde21556b3" Feb 19 19:47:43 crc kubenswrapper[4722]: I0219 19:47:43.955226 4722 scope.go:117] "RemoveContainer" containerID="687c2f6cd621666c11c3a553d69b13af20c5311d98a27db188d1d7153219352e" Feb 19 19:47:50 crc kubenswrapper[4722]: I0219 19:47:50.058728 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-j7hfg"] Feb 19 19:47:50 crc kubenswrapper[4722]: I0219 19:47:50.071789 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-db-create-j7hfg"] Feb 19 19:47:51 crc kubenswrapper[4722]: I0219 19:47:51.086726 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44a49a3a-3b7e-4b75-aae8-ba236c1bfc92" path="/var/lib/kubelet/pods/44a49a3a-3b7e-4b75-aae8-ba236c1bfc92/volumes" Feb 19 19:47:55 crc kubenswrapper[4722]: I0219 19:47:55.072350 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:47:55 crc kubenswrapper[4722]: E0219 19:47:55.073239 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.053484 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-eefc-account-create-update-h8n6c"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.146344 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-nqj2r"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.157205 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-67cbt"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.166197 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-89e2-account-create-update-7656w"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.181236 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-36cd-account-create-update-r5498"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.192385 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-eefc-account-create-update-h8n6c"] Feb 19 
19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.199220 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-67cbt"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.224226 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-nqj2r"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.230224 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-36cd-account-create-update-r5498"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.240546 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-89e2-account-create-update-7656w"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.253140 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-7kcsc"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.262209 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-7kcsc"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.271196 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0f99-account-create-update-fflhf"] Feb 19 19:47:57 crc kubenswrapper[4722]: I0219 19:47:57.280076 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0f99-account-create-update-fflhf"] Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.028653 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-8fd9q"] Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.037712 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-8fd9q"] Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.085604 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2039a569-0bc4-49a4-9e82-08964729dc7b" path="/var/lib/kubelet/pods/2039a569-0bc4-49a4-9e82-08964729dc7b/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.087506 4722 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="217ea569-e058-4f21-bbb7-d2f2648375eb" path="/var/lib/kubelet/pods/217ea569-e058-4f21-bbb7-d2f2648375eb/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.088083 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25905c52-4074-40d4-826f-ef89353eeaa6" path="/var/lib/kubelet/pods/25905c52-4074-40d4-826f-ef89353eeaa6/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.088677 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619d59b3-6514-4648-9007-6e9ce3427c3a" path="/var/lib/kubelet/pods/619d59b3-6514-4648-9007-6e9ce3427c3a/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.089924 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d81d51-f4b7-4dec-9548-982de19b4742" path="/var/lib/kubelet/pods/a8d81d51-f4b7-4dec-9548-982de19b4742/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.090477 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c78d063e-7cd7-4b41-b148-1a7f9a3f9914" path="/var/lib/kubelet/pods/c78d063e-7cd7-4b41-b148-1a7f9a3f9914/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.091077 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5778eec-eb7e-4137-85bd-761ac78b9fd7" path="/var/lib/kubelet/pods/d5778eec-eb7e-4137-85bd-761ac78b9fd7/volumes" Feb 19 19:47:59 crc kubenswrapper[4722]: I0219 19:47:59.092263 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe445148-46c0-4e8c-844a-51a5ce323370" path="/var/lib/kubelet/pods/fe445148-46c0-4e8c-844a-51a5ce323370/volumes" Feb 19 19:48:02 crc kubenswrapper[4722]: I0219 19:48:02.029242 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ws9fr"] Feb 19 19:48:02 crc kubenswrapper[4722]: I0219 19:48:02.040633 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ws9fr"] Feb 19 19:48:03 crc 
kubenswrapper[4722]: I0219 19:48:03.085380 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4dc7071-7951-4302-96d9-ef7e4f7f2ceb" path="/var/lib/kubelet/pods/a4dc7071-7951-4302-96d9-ef7e4f7f2ceb/volumes" Feb 19 19:48:07 crc kubenswrapper[4722]: I0219 19:48:07.071490 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:48:07 crc kubenswrapper[4722]: E0219 19:48:07.072139 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:48:18 crc kubenswrapper[4722]: I0219 19:48:18.071263 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:48:18 crc kubenswrapper[4722]: E0219 19:48:18.072055 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:48:29 crc kubenswrapper[4722]: I0219 19:48:29.071739 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:48:29 crc kubenswrapper[4722]: E0219 19:48:29.072637 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:48:32 crc kubenswrapper[4722]: I0219 19:48:32.044973 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7b98l"] Feb 19 19:48:32 crc kubenswrapper[4722]: I0219 19:48:32.055544 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7b98l"] Feb 19 19:48:33 crc kubenswrapper[4722]: I0219 19:48:33.084414 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab1ce59-2254-419a-bab0-cf5e87888634" path="/var/lib/kubelet/pods/eab1ce59-2254-419a-bab0-cf5e87888634/volumes" Feb 19 19:48:33 crc kubenswrapper[4722]: I0219 19:48:33.673496 4722 generic.go:334] "Generic (PLEG): container finished" podID="23a67d89-596c-44f0-b19d-dc5d1eb3021e" containerID="5a4c6d10bcfa0da53cc4b9e38924013e1f28f0ece8007cef9ebd1b78c76f2e64" exitCode=0 Feb 19 19:48:33 crc kubenswrapper[4722]: I0219 19:48:33.673621 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" event={"ID":"23a67d89-596c-44f0-b19d-dc5d1eb3021e","Type":"ContainerDied","Data":"5a4c6d10bcfa0da53cc4b9e38924013e1f28f0ece8007cef9ebd1b78c76f2e64"} Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.240854 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.434947 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory\") pod \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.435197 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam\") pod \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.435332 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w26gw\" (UniqueName: \"kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw\") pod \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\" (UID: \"23a67d89-596c-44f0-b19d-dc5d1eb3021e\") " Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.440891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw" (OuterVolumeSpecName: "kube-api-access-w26gw") pod "23a67d89-596c-44f0-b19d-dc5d1eb3021e" (UID: "23a67d89-596c-44f0-b19d-dc5d1eb3021e"). InnerVolumeSpecName "kube-api-access-w26gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.465757 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "23a67d89-596c-44f0-b19d-dc5d1eb3021e" (UID: "23a67d89-596c-44f0-b19d-dc5d1eb3021e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.469098 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory" (OuterVolumeSpecName: "inventory") pod "23a67d89-596c-44f0-b19d-dc5d1eb3021e" (UID: "23a67d89-596c-44f0-b19d-dc5d1eb3021e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.538419 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.538465 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w26gw\" (UniqueName: \"kubernetes.io/projected/23a67d89-596c-44f0-b19d-dc5d1eb3021e-kube-api-access-w26gw\") on node \"crc\" DevicePath \"\"" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.538478 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23a67d89-596c-44f0-b19d-dc5d1eb3021e-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.707205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" 
event={"ID":"23a67d89-596c-44f0-b19d-dc5d1eb3021e","Type":"ContainerDied","Data":"648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda"} Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.708094 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="648936524c7543748a566d0843fa93700b53feefaad86824db4e242f5b14fdda" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.707296 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.800728 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x"] Feb 19 19:48:35 crc kubenswrapper[4722]: E0219 19:48:35.802560 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a67d89-596c-44f0-b19d-dc5d1eb3021e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.802599 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a67d89-596c-44f0-b19d-dc5d1eb3021e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.802842 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a67d89-596c-44f0-b19d-dc5d1eb3021e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.803806 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.806079 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.807100 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.807127 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.807255 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.814192 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x"] Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.845651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzxv6\" (UniqueName: \"kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.845880 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: 
I0219 19:48:35.845914 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.947696 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.947755 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.947878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzxv6\" (UniqueName: \"kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.960853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" 
(UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.960876 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:35 crc kubenswrapper[4722]: I0219 19:48:35.969997 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzxv6\" (UniqueName: \"kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-7m66x\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:36 crc kubenswrapper[4722]: I0219 19:48:36.123407 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:48:36 crc kubenswrapper[4722]: I0219 19:48:36.677925 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x"] Feb 19 19:48:36 crc kubenswrapper[4722]: I0219 19:48:36.721088 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" event={"ID":"7a9a8806-dadf-4cd5-af24-fc35c7e52197","Type":"ContainerStarted","Data":"fc864cb4377eebf89734a22ccaa77d70ed0e86aab196f4dc6ade6eea7c72341d"} Feb 19 19:48:37 crc kubenswrapper[4722]: I0219 19:48:37.732699 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" event={"ID":"7a9a8806-dadf-4cd5-af24-fc35c7e52197","Type":"ContainerStarted","Data":"35295d18877f348be9df39c76a49295d5c3dcfa4c41c129460eba234068337c9"} Feb 19 19:48:37 crc kubenswrapper[4722]: I0219 19:48:37.758743 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" podStartSLOduration=2.326646943 podStartE2EDuration="2.758725407s" podCreationTimestamp="2026-02-19 19:48:35 +0000 UTC" firstStartedPulling="2026-02-19 19:48:36.683847138 +0000 UTC m=+1816.296197462" lastFinishedPulling="2026-02-19 19:48:37.115925602 +0000 UTC m=+1816.728275926" observedRunningTime="2026-02-19 19:48:37.747227686 +0000 UTC m=+1817.359578010" watchObservedRunningTime="2026-02-19 19:48:37.758725407 +0000 UTC m=+1817.371075721" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.071121 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:48:44 crc kubenswrapper[4722]: E0219 19:48:44.071918 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.205075 4722 scope.go:117] "RemoveContainer" containerID="bb275fbcbbe35a94955e26075778ab6128134f99af8b8d18b788e7b11aac61c6" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.235952 4722 scope.go:117] "RemoveContainer" containerID="bf1fddeb0ef2831ba2e02a1aa709a530121f690fbf768791dd2408b9c18e9009" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.282051 4722 scope.go:117] "RemoveContainer" containerID="42d0c57b026c599554638595f7678853fcba7c141ed4152a46e9c34dcadec9ce" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.328327 4722 scope.go:117] "RemoveContainer" containerID="a4f4b237835194ac1fcedd350c7532fc74f42e672c498f5c9cea05272f6986a0" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.396143 4722 scope.go:117] "RemoveContainer" containerID="a1c03548ff56ab3102ffaa64e0990092747adeddc1030d3c048e1f3f59e0095b" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.463529 4722 scope.go:117] "RemoveContainer" containerID="6c2e2442beaae76dbd599637b272c7eae6a58710a3bb17eed3e61507df9ea9e0" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.487088 4722 scope.go:117] "RemoveContainer" containerID="8c73c8e1b7d4896f7ab7a5272b3c22c63e7d90ad3033ca9be834b667cd882b7f" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.523723 4722 scope.go:117] "RemoveContainer" containerID="3655d044c293425ea96154111c219b4b647a3c98ed5018f1350933db2f9bafe5" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.554031 4722 scope.go:117] "RemoveContainer" containerID="6b62751b62c97e1ba880132d8b9f91b0968a628ff8eb98b71cf2b1fff30986bd" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 
19:48:44.591636 4722 scope.go:117] "RemoveContainer" containerID="24deafb2187b5509b9a503b5cde68eab414e437eef2f36f8141214811c39e398" Feb 19 19:48:44 crc kubenswrapper[4722]: I0219 19:48:44.624425 4722 scope.go:117] "RemoveContainer" containerID="6d49fd861306d1a47364956e09d02157a9618a565198ef080d63694bf02fdc31" Feb 19 19:48:51 crc kubenswrapper[4722]: I0219 19:48:51.042089 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-f6zx8"] Feb 19 19:48:51 crc kubenswrapper[4722]: I0219 19:48:51.056013 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zrwzj"] Feb 19 19:48:51 crc kubenswrapper[4722]: I0219 19:48:51.067025 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-f6zx8"] Feb 19 19:48:51 crc kubenswrapper[4722]: I0219 19:48:51.085048 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6175472a-2fd6-4b07-bcb1-4e441a4587aa" path="/var/lib/kubelet/pods/6175472a-2fd6-4b07-bcb1-4e441a4587aa/volumes" Feb 19 19:48:51 crc kubenswrapper[4722]: I0219 19:48:51.085648 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zrwzj"] Feb 19 19:48:53 crc kubenswrapper[4722]: I0219 19:48:53.092018 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41216a8d-32f8-4ec6-ab65-5474453cad03" path="/var/lib/kubelet/pods/41216a8d-32f8-4ec6-ab65-5474453cad03/volumes" Feb 19 19:48:56 crc kubenswrapper[4722]: I0219 19:48:56.071197 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:48:56 crc kubenswrapper[4722]: E0219 19:48:56.071855 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:49:01 crc kubenswrapper[4722]: I0219 19:49:01.035249 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-lnf5k"] Feb 19 19:49:01 crc kubenswrapper[4722]: I0219 19:49:01.045027 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-lnf5k"] Feb 19 19:49:01 crc kubenswrapper[4722]: I0219 19:49:01.082248 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c2453a9-4c81-4256-b52d-edb69c12c7d7" path="/var/lib/kubelet/pods/9c2453a9-4c81-4256-b52d-edb69c12c7d7/volumes" Feb 19 19:49:02 crc kubenswrapper[4722]: I0219 19:49:02.038222 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nldcm"] Feb 19 19:49:02 crc kubenswrapper[4722]: I0219 19:49:02.048596 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nldcm"] Feb 19 19:49:03 crc kubenswrapper[4722]: I0219 19:49:03.085899 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512a4c5e-3ea6-42a8-9f83-8c0e5375891d" path="/var/lib/kubelet/pods/512a4c5e-3ea6-42a8-9f83-8c0e5375891d/volumes" Feb 19 19:49:11 crc kubenswrapper[4722]: I0219 19:49:11.084575 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:49:11 crc kubenswrapper[4722]: E0219 19:49:11.086424 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:49:23 crc kubenswrapper[4722]: I0219 19:49:23.073066 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:49:23 crc kubenswrapper[4722]: E0219 19:49:23.073851 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:49:37 crc kubenswrapper[4722]: I0219 19:49:37.071367 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:49:37 crc kubenswrapper[4722]: E0219 19:49:37.072295 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:49:44 crc kubenswrapper[4722]: I0219 19:49:44.893295 4722 scope.go:117] "RemoveContainer" containerID="30471834ccd229c96e079cf27c896a4ce03111bf3efa26fc347d5a87d8bb97cd" Feb 19 19:49:44 crc kubenswrapper[4722]: I0219 19:49:44.928872 4722 scope.go:117] "RemoveContainer" containerID="90f4e39d24966e113ef88317b89ebc0b17164774e86b8e7cdf9bced518e5ecd6" Feb 19 19:49:44 crc kubenswrapper[4722]: I0219 19:49:44.988992 4722 scope.go:117] "RemoveContainer" containerID="fe4925460ebe652124a5ffa51ecf1f233c20847811e9da501b19b829671482b6" Feb 19 19:49:45 crc kubenswrapper[4722]: I0219 19:49:45.040079 4722 
scope.go:117] "RemoveContainer" containerID="c6a2c92ed1dfd6a529b0d6c2d06234eb6f8f5c4b6c0afa3fd878de3dc02ea9ee" Feb 19 19:49:49 crc kubenswrapper[4722]: I0219 19:49:49.601849 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a9a8806-dadf-4cd5-af24-fc35c7e52197" containerID="35295d18877f348be9df39c76a49295d5c3dcfa4c41c129460eba234068337c9" exitCode=0 Feb 19 19:49:49 crc kubenswrapper[4722]: I0219 19:49:49.601938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" event={"ID":"7a9a8806-dadf-4cd5-af24-fc35c7e52197","Type":"ContainerDied","Data":"35295d18877f348be9df39c76a49295d5c3dcfa4c41c129460eba234068337c9"} Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.140119 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.201403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam\") pod \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.201480 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzxv6\" (UniqueName: \"kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6\") pod \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\" (UID: \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.201638 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory\") pod \"7a9a8806-dadf-4cd5-af24-fc35c7e52197\" (UID: 
\"7a9a8806-dadf-4cd5-af24-fc35c7e52197\") " Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.208493 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6" (OuterVolumeSpecName: "kube-api-access-zzxv6") pod "7a9a8806-dadf-4cd5-af24-fc35c7e52197" (UID: "7a9a8806-dadf-4cd5-af24-fc35c7e52197"). InnerVolumeSpecName "kube-api-access-zzxv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.243375 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory" (OuterVolumeSpecName: "inventory") pod "7a9a8806-dadf-4cd5-af24-fc35c7e52197" (UID: "7a9a8806-dadf-4cd5-af24-fc35c7e52197"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.245618 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7a9a8806-dadf-4cd5-af24-fc35c7e52197" (UID: "7a9a8806-dadf-4cd5-af24-fc35c7e52197"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.303978 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.304016 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7a9a8806-dadf-4cd5-af24-fc35c7e52197-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.304039 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzxv6\" (UniqueName: \"kubernetes.io/projected/7a9a8806-dadf-4cd5-af24-fc35c7e52197-kube-api-access-zzxv6\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.639327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" event={"ID":"7a9a8806-dadf-4cd5-af24-fc35c7e52197","Type":"ContainerDied","Data":"fc864cb4377eebf89734a22ccaa77d70ed0e86aab196f4dc6ade6eea7c72341d"} Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.639416 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc864cb4377eebf89734a22ccaa77d70ed0e86aab196f4dc6ade6eea7c72341d" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.639414 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-7m66x" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.720603 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v"] Feb 19 19:49:51 crc kubenswrapper[4722]: E0219 19:49:51.721170 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9a8806-dadf-4cd5-af24-fc35c7e52197" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.721196 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9a8806-dadf-4cd5-af24-fc35c7e52197" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.721450 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9a8806-dadf-4cd5-af24-fc35c7e52197" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.722399 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.729100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.729537 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.729722 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.730238 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.733298 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v"] Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.813246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.813307 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: 
I0219 19:49:51.813367 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm9tm\" (UniqueName: \"kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.915356 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.915725 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.915773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm9tm\" (UniqueName: \"kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.919542 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.919646 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:51 crc kubenswrapper[4722]: I0219 19:49:51.933947 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm9tm\" (UniqueName: \"kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:52 crc kubenswrapper[4722]: I0219 19:49:52.041458 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:52 crc kubenswrapper[4722]: I0219 19:49:52.071593 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:49:52 crc kubenswrapper[4722]: E0219 19:49:52.072036 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:49:52 crc kubenswrapper[4722]: I0219 19:49:52.564962 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v"] Feb 19 19:49:52 crc kubenswrapper[4722]: I0219 19:49:52.651003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" event={"ID":"b51489f6-90e0-4a0d-ae54-24eb1e6f5568","Type":"ContainerStarted","Data":"86af48b9e83d7d95adbe2df6fe8802abce22c90db515ffce0069648800fa485e"} Feb 19 19:49:53 crc kubenswrapper[4722]: I0219 19:49:53.659768 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" event={"ID":"b51489f6-90e0-4a0d-ae54-24eb1e6f5568","Type":"ContainerStarted","Data":"1c1bf7a6160f737df17b0d7983c1732ea91ec594a77204a5e21603edacd16db0"} Feb 19 19:49:53 crc kubenswrapper[4722]: I0219 19:49:53.682734 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" podStartSLOduration=2.151315433 podStartE2EDuration="2.682711369s" podCreationTimestamp="2026-02-19 19:49:51 +0000 UTC" 
firstStartedPulling="2026-02-19 19:49:52.567229232 +0000 UTC m=+1892.179579556" lastFinishedPulling="2026-02-19 19:49:53.098625168 +0000 UTC m=+1892.710975492" observedRunningTime="2026-02-19 19:49:53.679805572 +0000 UTC m=+1893.292155906" watchObservedRunningTime="2026-02-19 19:49:53.682711369 +0000 UTC m=+1893.295061693" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.694493 4722 generic.go:334] "Generic (PLEG): container finished" podID="b51489f6-90e0-4a0d-ae54-24eb1e6f5568" containerID="1c1bf7a6160f737df17b0d7983c1732ea91ec594a77204a5e21603edacd16db0" exitCode=0 Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.694859 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" event={"ID":"b51489f6-90e0-4a0d-ae54-24eb1e6f5568","Type":"ContainerDied","Data":"1c1bf7a6160f737df17b0d7983c1732ea91ec594a77204a5e21603edacd16db0"} Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.701521 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8p2td"] Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.704528 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.709966 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8p2td"] Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.732162 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.732349 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.732394 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wszsv\" (UniqueName: \"kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.834912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.835014 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wszsv\" (UniqueName: \"kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.835082 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.835740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.835742 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:57 crc kubenswrapper[4722]: I0219 19:49:57.884241 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wszsv\" (UniqueName: \"kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv\") pod \"certified-operators-8p2td\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:58 crc kubenswrapper[4722]: I0219 19:49:58.022197 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:49:58 crc kubenswrapper[4722]: I0219 19:49:58.485054 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8p2td"] Feb 19 19:49:58 crc kubenswrapper[4722]: I0219 19:49:58.705911 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerStarted","Data":"fc6bc749ec8e5faa86281dd9afaa400f15102b634bc38df4e83908220e9f8b43"} Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.191933 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.263635 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory\") pod \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.263743 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm9tm\" (UniqueName: \"kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm\") pod \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.263778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam\") pod \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\" (UID: \"b51489f6-90e0-4a0d-ae54-24eb1e6f5568\") " Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.278735 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm" (OuterVolumeSpecName: "kube-api-access-gm9tm") pod "b51489f6-90e0-4a0d-ae54-24eb1e6f5568" (UID: "b51489f6-90e0-4a0d-ae54-24eb1e6f5568"). InnerVolumeSpecName "kube-api-access-gm9tm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.288566 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm9tm\" (UniqueName: \"kubernetes.io/projected/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-kube-api-access-gm9tm\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.330959 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory" (OuterVolumeSpecName: "inventory") pod "b51489f6-90e0-4a0d-ae54-24eb1e6f5568" (UID: "b51489f6-90e0-4a0d-ae54-24eb1e6f5568"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.333334 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b51489f6-90e0-4a0d-ae54-24eb1e6f5568" (UID: "b51489f6-90e0-4a0d-ae54-24eb1e6f5568"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.391184 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.391243 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b51489f6-90e0-4a0d-ae54-24eb1e6f5568-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.481302 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"] Feb 19 19:49:59 crc kubenswrapper[4722]: E0219 19:49:59.482023 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51489f6-90e0-4a0d-ae54-24eb1e6f5568" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.482051 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51489f6-90e0-4a0d-ae54-24eb1e6f5568" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.482321 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51489f6-90e0-4a0d-ae54-24eb1e6f5568" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.484376 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.494167 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"] Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.595224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.595479 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.595662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp6x9\" (UniqueName: \"kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.698099 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.698200 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qp6x9\" (UniqueName: \"kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.698305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.699095 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.699352 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.716486 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp6x9\" (UniqueName: \"kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9\") pod \"redhat-marketplace-rjnjm\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.717214 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" event={"ID":"b51489f6-90e0-4a0d-ae54-24eb1e6f5568","Type":"ContainerDied","Data":"86af48b9e83d7d95adbe2df6fe8802abce22c90db515ffce0069648800fa485e"} Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.717303 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86af48b9e83d7d95adbe2df6fe8802abce22c90db515ffce0069648800fa485e" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.717287 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.719244 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerID="63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2" exitCode=0 Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.719278 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerDied","Data":"63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2"} Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.807201 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82"] Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.808943 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.811632 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.811784 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.811897 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.820133 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82"] Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.822268 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:49:59 crc kubenswrapper[4722]: I0219 19:49:59.833360 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.141234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.141273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxf9n\" (UniqueName: \"kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: 
\"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.141383 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.242903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.243088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.243113 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxf9n\" (UniqueName: \"kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.251139 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.252463 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.265931 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxf9n\" (UniqueName: \"kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zmp82\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.348110 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.643630 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"] Feb 19 19:50:00 crc kubenswrapper[4722]: I0219 19:50:00.729225 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerStarted","Data":"5994d3f94bd355ac82deb7f5cfe2381af3489b711b13ce6948c137e7c6fdadf4"} Feb 19 19:50:01 crc kubenswrapper[4722]: I0219 19:50:01.247069 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82"] Feb 19 19:50:01 crc kubenswrapper[4722]: I0219 19:50:01.739582 4722 generic.go:334] "Generic (PLEG): container finished" podID="1834c92c-87c2-44ab-acda-1170f3a92303" containerID="f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e" exitCode=0 Feb 19 19:50:01 crc kubenswrapper[4722]: I0219 19:50:01.739628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerDied","Data":"f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e"} Feb 19 19:50:01 crc kubenswrapper[4722]: I0219 19:50:01.741667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" event={"ID":"fa0d4605-cd87-49b1-b17f-8c0e06590afd","Type":"ContainerStarted","Data":"bdc4d0bdb826bb0e1954e08f60e022a246a6429f3743c4c5ecdb3bc0104f4b0e"} Feb 19 19:50:03 crc kubenswrapper[4722]: I0219 19:50:03.720437 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" 
event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerStarted","Data":"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf"} Feb 19 19:50:04 crc kubenswrapper[4722]: I0219 19:50:04.730886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerStarted","Data":"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"} Feb 19 19:50:04 crc kubenswrapper[4722]: I0219 19:50:04.732407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" event={"ID":"fa0d4605-cd87-49b1-b17f-8c0e06590afd","Type":"ContainerStarted","Data":"8ef5eb3908214f96f5d6505146ed47b1529675834981d6c6dfbef8f12a789667"} Feb 19 19:50:04 crc kubenswrapper[4722]: I0219 19:50:04.882567 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" podStartSLOduration=3.291077917 podStartE2EDuration="5.882536618s" podCreationTimestamp="2026-02-19 19:49:59 +0000 UTC" firstStartedPulling="2026-02-19 19:50:01.237614989 +0000 UTC m=+1900.849965313" lastFinishedPulling="2026-02-19 19:50:03.82907369 +0000 UTC m=+1903.441424014" observedRunningTime="2026-02-19 19:50:04.878964376 +0000 UTC m=+1904.491314700" watchObservedRunningTime="2026-02-19 19:50:04.882536618 +0000 UTC m=+1904.494886942" Feb 19 19:50:06 crc kubenswrapper[4722]: I0219 19:50:06.071796 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:50:06 crc kubenswrapper[4722]: E0219 19:50:06.072392 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:50:08 crc kubenswrapper[4722]: I0219 19:50:08.773732 4722 generic.go:334] "Generic (PLEG): container finished" podID="1834c92c-87c2-44ab-acda-1170f3a92303" containerID="456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c" exitCode=0 Feb 19 19:50:08 crc kubenswrapper[4722]: I0219 19:50:08.773906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerDied","Data":"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"} Feb 19 19:50:10 crc kubenswrapper[4722]: E0219 19:50:10.748746 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4b45dc0_a6d0_4572_be7f_93dc70be0a17.slice/crio-conmon-ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf.scope\": RecentStats: unable to find data in memory cache]" Feb 19 19:50:10 crc kubenswrapper[4722]: I0219 19:50:10.793971 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerID="ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf" exitCode=0 Feb 19 19:50:10 crc kubenswrapper[4722]: I0219 19:50:10.794021 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerDied","Data":"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf"} Feb 19 19:50:11 crc kubenswrapper[4722]: I0219 19:50:11.045822 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4fzxz"] Feb 19 19:50:11 crc kubenswrapper[4722]: 
I0219 19:50:11.061719 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-nq58z"] Feb 19 19:50:11 crc kubenswrapper[4722]: I0219 19:50:11.089472 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4fzxz"] Feb 19 19:50:11 crc kubenswrapper[4722]: I0219 19:50:11.089514 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-l92p9"] Feb 19 19:50:11 crc kubenswrapper[4722]: I0219 19:50:11.095824 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-nq58z"] Feb 19 19:50:11 crc kubenswrapper[4722]: I0219 19:50:11.105379 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-l92p9"] Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.035439 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-bc61-account-create-update-km828"] Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.048020 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cda6-account-create-update-45ddh"] Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.059969 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b095-account-create-update-d2ffx"] Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.072231 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-bc61-account-create-update-km828"] Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.082487 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-cda6-account-create-update-45ddh"] Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.091884 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b095-account-create-update-d2ffx"] Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.816734 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerStarted","Data":"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"} Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.819954 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerStarted","Data":"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c"} Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.846052 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rjnjm" podStartSLOduration=3.1951831139999998 podStartE2EDuration="13.846030867s" podCreationTimestamp="2026-02-19 19:49:59 +0000 UTC" firstStartedPulling="2026-02-19 19:50:01.743078179 +0000 UTC m=+1901.355428503" lastFinishedPulling="2026-02-19 19:50:12.393925932 +0000 UTC m=+1912.006276256" observedRunningTime="2026-02-19 19:50:12.839181853 +0000 UTC m=+1912.451532177" watchObservedRunningTime="2026-02-19 19:50:12.846030867 +0000 UTC m=+1912.458381201" Feb 19 19:50:12 crc kubenswrapper[4722]: I0219 19:50:12.866654 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8p2td" podStartSLOduration=3.032853581 podStartE2EDuration="15.866631269s" podCreationTimestamp="2026-02-19 19:49:57 +0000 UTC" firstStartedPulling="2026-02-19 19:49:59.72112044 +0000 UTC m=+1899.333470764" lastFinishedPulling="2026-02-19 19:50:12.554898138 +0000 UTC m=+1912.167248452" observedRunningTime="2026-02-19 19:50:12.855919085 +0000 UTC m=+1912.468269429" watchObservedRunningTime="2026-02-19 19:50:12.866631269 +0000 UTC m=+1912.478981603" Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.083742 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f262eb9-64a7-4b10-85f9-4bc43d512f60" 
path="/var/lib/kubelet/pods/3f262eb9-64a7-4b10-85f9-4bc43d512f60/volumes" Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.084432 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b8ebb77-caea-46ca-8989-d2dd37bf2df5" path="/var/lib/kubelet/pods/5b8ebb77-caea-46ca-8989-d2dd37bf2df5/volumes" Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.085060 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a72e03c-87f6-4d54-8ea1-f8abed33bd2c" path="/var/lib/kubelet/pods/6a72e03c-87f6-4d54-8ea1-f8abed33bd2c/volumes" Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.085709 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="823fc346-84d0-4920-bc42-ec213d0c6eef" path="/var/lib/kubelet/pods/823fc346-84d0-4920-bc42-ec213d0c6eef/volumes" Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.086967 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84699ef3-8d21-4493-8875-81de167ee617" path="/var/lib/kubelet/pods/84699ef3-8d21-4493-8875-81de167ee617/volumes" Feb 19 19:50:13 crc kubenswrapper[4722]: I0219 19:50:13.087615 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6e27062-a94f-4d8d-8a07-b940d9aa572e" path="/var/lib/kubelet/pods/c6e27062-a94f-4d8d-8a07-b940d9aa572e/volumes" Feb 19 19:50:18 crc kubenswrapper[4722]: I0219 19:50:18.022804 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:50:18 crc kubenswrapper[4722]: I0219 19:50:18.023465 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:50:18 crc kubenswrapper[4722]: I0219 19:50:18.071256 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:50:18 crc kubenswrapper[4722]: I0219 19:50:18.957101 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:50:19 crc kubenswrapper[4722]: I0219 19:50:19.024065 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8p2td"] Feb 19 19:50:19 crc kubenswrapper[4722]: I0219 19:50:19.824586 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:50:19 crc kubenswrapper[4722]: I0219 19:50:19.824672 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:50:19 crc kubenswrapper[4722]: I0219 19:50:19.873800 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:50:19 crc kubenswrapper[4722]: I0219 19:50:19.926220 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:50:20 crc kubenswrapper[4722]: I0219 19:50:20.071507 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:50:20 crc kubenswrapper[4722]: E0219 19:50:20.071749 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:50:20 crc kubenswrapper[4722]: I0219 19:50:20.707022 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"] Feb 19 19:50:20 crc kubenswrapper[4722]: I0219 19:50:20.895846 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-8p2td" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="registry-server" containerID="cri-o://71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c" gracePeriod=2 Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.459045 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.556023 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content\") pod \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.556229 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities\") pod \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.556291 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wszsv\" (UniqueName: \"kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv\") pod \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\" (UID: \"b4b45dc0-a6d0-4572-be7f-93dc70be0a17\") " Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.557202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities" (OuterVolumeSpecName: "utilities") pod "b4b45dc0-a6d0-4572-be7f-93dc70be0a17" (UID: "b4b45dc0-a6d0-4572-be7f-93dc70be0a17"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.561611 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv" (OuterVolumeSpecName: "kube-api-access-wszsv") pod "b4b45dc0-a6d0-4572-be7f-93dc70be0a17" (UID: "b4b45dc0-a6d0-4572-be7f-93dc70be0a17"). InnerVolumeSpecName "kube-api-access-wszsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.609284 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4b45dc0-a6d0-4572-be7f-93dc70be0a17" (UID: "b4b45dc0-a6d0-4572-be7f-93dc70be0a17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.661642 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.661686 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wszsv\" (UniqueName: \"kubernetes.io/projected/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-kube-api-access-wszsv\") on node \"crc\" DevicePath \"\"" Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.661703 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b45dc0-a6d0-4572-be7f-93dc70be0a17-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.905969 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" 
containerID="71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c" exitCode=0 Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.906027 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8p2td" Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.906039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerDied","Data":"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c"} Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.906084 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8p2td" event={"ID":"b4b45dc0-a6d0-4572-be7f-93dc70be0a17","Type":"ContainerDied","Data":"fc6bc749ec8e5faa86281dd9afaa400f15102b634bc38df4e83908220e9f8b43"} Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.906102 4722 scope.go:117] "RemoveContainer" containerID="71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c" Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.906442 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rjnjm" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="registry-server" containerID="cri-o://910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03" gracePeriod=2 Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.937387 4722 scope.go:117] "RemoveContainer" containerID="ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf" Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.939429 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8p2td"] Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.952770 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-8p2td"] Feb 19 19:50:21 crc kubenswrapper[4722]: I0219 19:50:21.961801 4722 scope.go:117] "RemoveContainer" containerID="63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.116414 4722 scope.go:117] "RemoveContainer" containerID="71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c" Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.124396 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c\": container with ID starting with 71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c not found: ID does not exist" containerID="71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.124467 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c"} err="failed to get container status \"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c\": rpc error: code = NotFound desc = could not find container \"71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c\": container with ID starting with 71028bf464e6b16c36d87a4c1c0138143d4f0eea036c2eb25dc3eae3238f0d0c not found: ID does not exist" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.124501 4722 scope.go:117] "RemoveContainer" containerID="ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf" Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.125094 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf\": container with ID starting with 
ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf not found: ID does not exist" containerID="ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.125170 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf"} err="failed to get container status \"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf\": rpc error: code = NotFound desc = could not find container \"ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf\": container with ID starting with ba3fa937e341114c5cc68ec5bf689427ee7d95565242f727b47f994906268baf not found: ID does not exist" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.125204 4722 scope.go:117] "RemoveContainer" containerID="63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2" Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.125568 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2\": container with ID starting with 63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2 not found: ID does not exist" containerID="63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.125598 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2"} err="failed to get container status \"63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2\": rpc error: code = NotFound desc = could not find container \"63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2\": container with ID starting with 63aa68b2c95ca035345256a32fef21039110ac782031b98cdc298a53dfd1f8f2 not found: ID does not 
exist" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.431112 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjnjm" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.477881 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content\") pod \"1834c92c-87c2-44ab-acda-1170f3a92303\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.477937 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp6x9\" (UniqueName: \"kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9\") pod \"1834c92c-87c2-44ab-acda-1170f3a92303\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.478143 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities\") pod \"1834c92c-87c2-44ab-acda-1170f3a92303\" (UID: \"1834c92c-87c2-44ab-acda-1170f3a92303\") " Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.478942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities" (OuterVolumeSpecName: "utilities") pod "1834c92c-87c2-44ab-acda-1170f3a92303" (UID: "1834c92c-87c2-44ab-acda-1170f3a92303"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.484107 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9" (OuterVolumeSpecName: "kube-api-access-qp6x9") pod "1834c92c-87c2-44ab-acda-1170f3a92303" (UID: "1834c92c-87c2-44ab-acda-1170f3a92303"). InnerVolumeSpecName "kube-api-access-qp6x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.502896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1834c92c-87c2-44ab-acda-1170f3a92303" (UID: "1834c92c-87c2-44ab-acda-1170f3a92303"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.581350 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.581390 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp6x9\" (UniqueName: \"kubernetes.io/projected/1834c92c-87c2-44ab-acda-1170f3a92303-kube-api-access-qp6x9\") on node \"crc\" DevicePath \"\"" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.581406 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1834c92c-87c2-44ab-acda-1170f3a92303-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.919395 4722 generic.go:334] "Generic (PLEG): container finished" podID="1834c92c-87c2-44ab-acda-1170f3a92303" 
containerID="910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03" exitCode=0
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.919486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerDied","Data":"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"}
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.919890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rjnjm" event={"ID":"1834c92c-87c2-44ab-acda-1170f3a92303","Type":"ContainerDied","Data":"5994d3f94bd355ac82deb7f5cfe2381af3489b711b13ce6948c137e7c6fdadf4"}
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.919925 4722 scope.go:117] "RemoveContainer" containerID="910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.919520 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rjnjm"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.947224 4722 scope.go:117] "RemoveContainer" containerID="456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.958138 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"]
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.969121 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rjnjm"]
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.970573 4722 scope.go:117] "RemoveContainer" containerID="f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.995278 4722 scope.go:117] "RemoveContainer" containerID="910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"
Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.995777 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03\": container with ID starting with 910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03 not found: ID does not exist" containerID="910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.995819 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03"} err="failed to get container status \"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03\": rpc error: code = NotFound desc = could not find container \"910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03\": container with ID starting with 910dd1439f0155e42b5b5bbce6914dc260c4435efb8a88d142d8c1a43da7ab03 not found: ID does not exist"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.995846 4722 scope.go:117] "RemoveContainer" containerID="456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"
Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.996190 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c\": container with ID starting with 456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c not found: ID does not exist" containerID="456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.996231 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c"} err="failed to get container status \"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c\": rpc error: code = NotFound desc = could not find container \"456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c\": container with ID starting with 456db96d14fc78968d727e7fe8cd9a30e0761750b4f374ef5d42a5b0dd335f5c not found: ID does not exist"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.996257 4722 scope.go:117] "RemoveContainer" containerID="f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e"
Feb 19 19:50:22 crc kubenswrapper[4722]: E0219 19:50:22.996774 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e\": container with ID starting with f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e not found: ID does not exist" containerID="f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e"
Feb 19 19:50:22 crc kubenswrapper[4722]: I0219 19:50:22.996801 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e"} err="failed to get container status \"f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e\": rpc error: code = NotFound desc = could not find container \"f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e\": container with ID starting with f9fffa8693fff48e3101bd5401cd78629fcd666ae72ffcfcdfe2c33bdce69b8e not found: ID does not exist"
Feb 19 19:50:23 crc kubenswrapper[4722]: I0219 19:50:23.084850 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" path="/var/lib/kubelet/pods/1834c92c-87c2-44ab-acda-1170f3a92303/volumes"
Feb 19 19:50:23 crc kubenswrapper[4722]: I0219 19:50:23.085860 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" path="/var/lib/kubelet/pods/b4b45dc0-a6d0-4572-be7f-93dc70be0a17/volumes"
Feb 19 19:50:34 crc kubenswrapper[4722]: I0219 19:50:34.071808 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"
Feb 19 19:50:34 crc kubenswrapper[4722]: E0219 19:50:34.072816 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 19:50:38 crc kubenswrapper[4722]: I0219 19:50:38.036830 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cddxh"]
Feb 19 19:50:38 crc kubenswrapper[4722]: I0219 19:50:38.045381 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cddxh"]
Feb 19 19:50:39 crc kubenswrapper[4722]: I0219 19:50:39.337519 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2859f56-714b-43b5-bb67-6ee5493d4f11" path="/var/lib/kubelet/pods/e2859f56-714b-43b5-bb67-6ee5493d4f11/volumes"
Feb 19 19:50:40 crc kubenswrapper[4722]: I0219 19:50:40.342578 4722 generic.go:334] "Generic (PLEG): container finished" podID="fa0d4605-cd87-49b1-b17f-8c0e06590afd" containerID="8ef5eb3908214f96f5d6505146ed47b1529675834981d6c6dfbef8f12a789667" exitCode=0
Feb 19 19:50:40 crc kubenswrapper[4722]: I0219 19:50:40.342654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" event={"ID":"fa0d4605-cd87-49b1-b17f-8c0e06590afd","Type":"ContainerDied","Data":"8ef5eb3908214f96f5d6505146ed47b1529675834981d6c6dfbef8f12a789667"}
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.283400 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.428713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxf9n\" (UniqueName: \"kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n\") pod \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") "
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.428863 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory\") pod \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") "
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.430291 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam\") pod \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\" (UID: \"fa0d4605-cd87-49b1-b17f-8c0e06590afd\") "
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.433632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n" (OuterVolumeSpecName: "kube-api-access-kxf9n") pod "fa0d4605-cd87-49b1-b17f-8c0e06590afd" (UID: "fa0d4605-cd87-49b1-b17f-8c0e06590afd"). InnerVolumeSpecName "kube-api-access-kxf9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.454413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory" (OuterVolumeSpecName: "inventory") pod "fa0d4605-cd87-49b1-b17f-8c0e06590afd" (UID: "fa0d4605-cd87-49b1-b17f-8c0e06590afd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.467261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fa0d4605-cd87-49b1-b17f-8c0e06590afd" (UID: "fa0d4605-cd87-49b1-b17f-8c0e06590afd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.533561 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.533592 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxf9n\" (UniqueName: \"kubernetes.io/projected/fa0d4605-cd87-49b1-b17f-8c0e06590afd-kube-api-access-kxf9n\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.533603 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fa0d4605-cd87-49b1-b17f-8c0e06590afd-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.765181 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82" event={"ID":"fa0d4605-cd87-49b1-b17f-8c0e06590afd","Type":"ContainerDied","Data":"bdc4d0bdb826bb0e1954e08f60e022a246a6429f3743c4c5ecdb3bc0104f4b0e"}
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.765230 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdc4d0bdb826bb0e1954e08f60e022a246a6429f3743c4c5ecdb3bc0104f4b0e"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.765299 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zmp82"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817260 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"]
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817643 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa0d4605-cd87-49b1-b17f-8c0e06590afd" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817659 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa0d4605-cd87-49b1-b17f-8c0e06590afd" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817676 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817685 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817696 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="extract-content"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817703 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="extract-content"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817715 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="extract-content"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817720 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="extract-content"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817731 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817736 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817749 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="extract-utilities"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817755 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="extract-utilities"
Feb 19 19:50:42 crc kubenswrapper[4722]: E0219 19:50:42.817772 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="extract-utilities"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817777 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="extract-utilities"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817945 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1834c92c-87c2-44ab-acda-1170f3a92303" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817957 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa0d4605-cd87-49b1-b17f-8c0e06590afd" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.817983 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b45dc0-a6d0-4572-be7f-93dc70be0a17" containerName="registry-server"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.818635 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.820915 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.821107 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.821843 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.822661 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.835388 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"]
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.942189 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhmqn\" (UniqueName: \"kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.942764 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:42 crc kubenswrapper[4722]: I0219 19:50:42.943049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.318027 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.318396 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhmqn\" (UniqueName: \"kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.318488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.330081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.332861 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.353701 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhmqn\" (UniqueName: \"kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:43 crc kubenswrapper[4722]: I0219 19:50:43.441245 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:50:44 crc kubenswrapper[4722]: I0219 19:50:44.013769 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"]
Feb 19 19:50:44 crc kubenswrapper[4722]: I0219 19:50:44.784670 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" event={"ID":"7cf0842e-58ac-4cd1-b26f-9fc131177aa9","Type":"ContainerStarted","Data":"ed094b50ce9e4b6fb2a00e489a57b8c4478f4a3da5d6d489d485b87719753af5"}
Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.223670 4722 scope.go:117] "RemoveContainer" containerID="bcf1c97e5c8d595576441c4adc1b3414c50f70e142078a59a013a524b3fc5783"
Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.252586 4722 scope.go:117] "RemoveContainer" containerID="0c662d869f0260b21b14e815b1c26ef3d995bd4318e89a8c8d85dd5703eaa89e"
Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.288966 4722 scope.go:117] "RemoveContainer" containerID="02fdf9891e0a4a5e6a9cd6279f1ac5170d3eaad2e2904682a600a6d410fb2a19"
Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.337431 4722 scope.go:117] "RemoveContainer" containerID="59b7ab3b9b5c89b55e17c8616e639ea24cc02e1ca89d3d887ff255092c310b2a"
Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.356624 4722 scope.go:117] "RemoveContainer" containerID="34ce6fe937d88e617e83f04f4163bf9713e6cac4114d5734077d30be33461dbc"
Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.376874 4722 scope.go:117] "RemoveContainer" containerID="0edcf275740c511d92faf25dcc6aa827af0e172da4743fa7292bb01babbbeb7e"
Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.427864 4722 scope.go:117] "RemoveContainer" containerID="a8d182f3ca75056fc67eb781e9901b0b7fa4501055d209f0f02c035090c589a3"
Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.795747 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" event={"ID":"7cf0842e-58ac-4cd1-b26f-9fc131177aa9","Type":"ContainerStarted","Data":"c91f7b8f75e6df2eee89ce0b127406ba6281160fc85e04bd9c2b6a2e9c76dae0"}
Feb 19 19:50:45 crc kubenswrapper[4722]: I0219 19:50:45.827940 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" podStartSLOduration=3.166611313 podStartE2EDuration="3.827920098s" podCreationTimestamp="2026-02-19 19:50:42 +0000 UTC" firstStartedPulling="2026-02-19 19:50:44.020753796 +0000 UTC m=+1943.633104120" lastFinishedPulling="2026-02-19 19:50:44.682062571 +0000 UTC m=+1944.294412905" observedRunningTime="2026-02-19 19:50:45.814266581 +0000 UTC m=+1945.426616905" watchObservedRunningTime="2026-02-19 19:50:45.827920098 +0000 UTC m=+1945.440270422"
Feb 19 19:50:49 crc kubenswrapper[4722]: I0219 19:50:49.071571 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"
Feb 19 19:50:49 crc kubenswrapper[4722]: E0219 19:50:49.072291 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 19:51:01 crc kubenswrapper[4722]: I0219 19:51:01.048421 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-dzq9w"]
Feb 19 19:51:01 crc kubenswrapper[4722]: I0219 19:51:01.057581 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-dzq9w"]
Feb 19 19:51:01 crc kubenswrapper[4722]: I0219 19:51:01.083451 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a230c6-6844-4483-a8b4-0ae8073dff8d" path="/var/lib/kubelet/pods/d1a230c6-6844-4483-a8b4-0ae8073dff8d/volumes"
Feb 19 19:51:03 crc kubenswrapper[4722]: I0219 19:51:03.072057 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"
Feb 19 19:51:03 crc kubenswrapper[4722]: E0219 19:51:03.072753 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 19:51:07 crc kubenswrapper[4722]: I0219 19:51:07.032786 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nnkf"]
Feb 19 19:51:07 crc kubenswrapper[4722]: I0219 19:51:07.048927 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2nnkf"]
Feb 19 19:51:07 crc kubenswrapper[4722]: I0219 19:51:07.087795 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106da00f-55de-4b4f-8a57-b8f0b1994c2f" path="/var/lib/kubelet/pods/106da00f-55de-4b4f-8a57-b8f0b1994c2f/volumes"
Feb 19 19:51:15 crc kubenswrapper[4722]: I0219 19:51:15.071119 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc"
Feb 19 19:51:16 crc kubenswrapper[4722]: I0219 19:51:16.117830 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef"}
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.238442 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"]
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.243635 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.254614 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"]
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.332751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2tt9\" (UniqueName: \"kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.332982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.333410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.435187 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.435355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.435807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.435843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.435986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2tt9\" (UniqueName: \"kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.458679 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2tt9\" (UniqueName: \"kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9\") pod \"redhat-operators-p59r9\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:21 crc kubenswrapper[4722]: I0219 19:51:21.566826 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p59r9"
Feb 19 19:51:22 crc kubenswrapper[4722]: I0219 19:51:22.091298 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"]
Feb 19 19:51:22 crc kubenswrapper[4722]: I0219 19:51:22.193023 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerStarted","Data":"99ef2afc5d68e5988babf1bade224e9fb79a3934fe1cd919c9eb1f3329445aa3"}
Feb 19 19:51:23 crc kubenswrapper[4722]: I0219 19:51:23.205681 4722 generic.go:334] "Generic (PLEG): container finished" podID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerID="b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4" exitCode=0
Feb 19 19:51:23 crc kubenswrapper[4722]: I0219 19:51:23.205764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerDied","Data":"b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4"}
Feb 19 19:51:24 crc kubenswrapper[4722]: I0219 19:51:24.218718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerStarted","Data":"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d"}
Feb 19 19:51:28 crc kubenswrapper[4722]: I0219 19:51:28.259362 4722 generic.go:334] "Generic (PLEG): container finished" podID="7cf0842e-58ac-4cd1-b26f-9fc131177aa9" containerID="c91f7b8f75e6df2eee89ce0b127406ba6281160fc85e04bd9c2b6a2e9c76dae0" exitCode=0
Feb 19 19:51:28 crc kubenswrapper[4722]: I0219 19:51:28.259469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" event={"ID":"7cf0842e-58ac-4cd1-b26f-9fc131177aa9","Type":"ContainerDied","Data":"c91f7b8f75e6df2eee89ce0b127406ba6281160fc85e04bd9c2b6a2e9c76dae0"}
Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.775042 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.805416 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory\") pod \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") "
Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.805483 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhmqn\" (UniqueName: \"kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn\") pod \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") "
Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.805821 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam\") pod \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\" (UID: \"7cf0842e-58ac-4cd1-b26f-9fc131177aa9\") "
Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.813440 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn" (OuterVolumeSpecName: "kube-api-access-jhmqn") pod "7cf0842e-58ac-4cd1-b26f-9fc131177aa9" (UID: "7cf0842e-58ac-4cd1-b26f-9fc131177aa9"). InnerVolumeSpecName "kube-api-access-jhmqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.835490 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7cf0842e-58ac-4cd1-b26f-9fc131177aa9" (UID: "7cf0842e-58ac-4cd1-b26f-9fc131177aa9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.849399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory" (OuterVolumeSpecName: "inventory") pod "7cf0842e-58ac-4cd1-b26f-9fc131177aa9" (UID: "7cf0842e-58ac-4cd1-b26f-9fc131177aa9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.908241 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.908282 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 19:51:29 crc kubenswrapper[4722]: I0219 19:51:29.908295 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhmqn\" (UniqueName: \"kubernetes.io/projected/7cf0842e-58ac-4cd1-b26f-9fc131177aa9-kube-api-access-jhmqn\") on node \"crc\" DevicePath \"\""
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.278010 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.278003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t" event={"ID":"7cf0842e-58ac-4cd1-b26f-9fc131177aa9","Type":"ContainerDied","Data":"ed094b50ce9e4b6fb2a00e489a57b8c4478f4a3da5d6d489d485b87719753af5"}
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.278069 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed094b50ce9e4b6fb2a00e489a57b8c4478f4a3da5d6d489d485b87719753af5"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.280074 4722 generic.go:334] "Generic (PLEG): container finished" podID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerID="b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d" exitCode=0
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.280108 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerDied","Data":"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d"}
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.392488 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rr66z"]
Feb 19 19:51:30 crc kubenswrapper[4722]: E0219 19:51:30.392898 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf0842e-58ac-4cd1-b26f-9fc131177aa9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.392922 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf0842e-58ac-4cd1-b26f-9fc131177aa9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.393571 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf0842e-58ac-4cd1-b26f-9fc131177aa9" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.394390 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.398223 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.398271 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.398426 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.400086 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.406817 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rr66z"]
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.425329 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z"
Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.425433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mr57\" (UniqueName: \"kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID:
\"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.425552 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.527487 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.527536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.527608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mr57\" (UniqueName: \"kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.531500 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.532125 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.549809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mr57\" (UniqueName: \"kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57\") pod \"ssh-known-hosts-edpm-deployment-rr66z\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:30 crc kubenswrapper[4722]: I0219 19:51:30.724783 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:31 crc kubenswrapper[4722]: I0219 19:51:31.844839 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-rr66z"] Feb 19 19:51:32 crc kubenswrapper[4722]: I0219 19:51:32.743793 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" event={"ID":"812efe23-7ca7-49b9-bd76-194a82c603b3","Type":"ContainerStarted","Data":"2df917e43a416f61f0915f579f37cf08041e0c25666ea46224e704144a375c11"} Feb 19 19:51:33 crc kubenswrapper[4722]: I0219 19:51:33.757034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerStarted","Data":"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2"} Feb 19 19:51:33 crc kubenswrapper[4722]: I0219 19:51:33.782817 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p59r9" podStartSLOduration=2.764545821 podStartE2EDuration="12.782796224s" podCreationTimestamp="2026-02-19 19:51:21 +0000 UTC" firstStartedPulling="2026-02-19 19:51:23.207563153 +0000 UTC m=+1982.819913477" lastFinishedPulling="2026-02-19 19:51:33.225813566 +0000 UTC m=+1992.838163880" observedRunningTime="2026-02-19 19:51:33.775629771 +0000 UTC m=+1993.387980115" watchObservedRunningTime="2026-02-19 19:51:33.782796224 +0000 UTC m=+1993.395146548" Feb 19 19:51:34 crc kubenswrapper[4722]: I0219 19:51:34.770743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" event={"ID":"812efe23-7ca7-49b9-bd76-194a82c603b3","Type":"ContainerStarted","Data":"abe5af0f2de7ed90cae73238503c9bc96c0c732faee4f5f8a93b3f0a5fd43d62"} Feb 19 19:51:34 crc kubenswrapper[4722]: I0219 19:51:34.805608 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" podStartSLOduration=3.10742121 podStartE2EDuration="4.805580834s" podCreationTimestamp="2026-02-19 19:51:30 +0000 UTC" firstStartedPulling="2026-02-19 19:51:31.840989594 +0000 UTC m=+1991.453339918" lastFinishedPulling="2026-02-19 19:51:33.539149218 +0000 UTC m=+1993.151499542" observedRunningTime="2026-02-19 19:51:34.792950939 +0000 UTC m=+1994.405301263" watchObservedRunningTime="2026-02-19 19:51:34.805580834 +0000 UTC m=+1994.417931178" Feb 19 19:51:39 crc kubenswrapper[4722]: I0219 19:51:39.819800 4722 generic.go:334] "Generic (PLEG): container finished" podID="812efe23-7ca7-49b9-bd76-194a82c603b3" containerID="abe5af0f2de7ed90cae73238503c9bc96c0c732faee4f5f8a93b3f0a5fd43d62" exitCode=0 Feb 19 19:51:39 crc kubenswrapper[4722]: I0219 19:51:39.819901 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" event={"ID":"812efe23-7ca7-49b9-bd76-194a82c603b3","Type":"ContainerDied","Data":"abe5af0f2de7ed90cae73238503c9bc96c0c732faee4f5f8a93b3f0a5fd43d62"} Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.347987 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.526888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0\") pod \"812efe23-7ca7-49b9-bd76-194a82c603b3\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.527007 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mr57\" (UniqueName: \"kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57\") pod \"812efe23-7ca7-49b9-bd76-194a82c603b3\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.527198 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam\") pod \"812efe23-7ca7-49b9-bd76-194a82c603b3\" (UID: \"812efe23-7ca7-49b9-bd76-194a82c603b3\") " Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.533318 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57" (OuterVolumeSpecName: "kube-api-access-6mr57") pod "812efe23-7ca7-49b9-bd76-194a82c603b3" (UID: "812efe23-7ca7-49b9-bd76-194a82c603b3"). InnerVolumeSpecName "kube-api-access-6mr57". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.557331 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "812efe23-7ca7-49b9-bd76-194a82c603b3" (UID: "812efe23-7ca7-49b9-bd76-194a82c603b3"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.558716 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "812efe23-7ca7-49b9-bd76-194a82c603b3" (UID: "812efe23-7ca7-49b9-bd76-194a82c603b3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.567070 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.567247 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.618852 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.629528 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.629567 4722 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/812efe23-7ca7-49b9-bd76-194a82c603b3-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.629581 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mr57\" (UniqueName: \"kubernetes.io/projected/812efe23-7ca7-49b9-bd76-194a82c603b3-kube-api-access-6mr57\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:41 crc kubenswrapper[4722]: 
I0219 19:51:41.843019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" event={"ID":"812efe23-7ca7-49b9-bd76-194a82c603b3","Type":"ContainerDied","Data":"2df917e43a416f61f0915f579f37cf08041e0c25666ea46224e704144a375c11"} Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.843064 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2df917e43a416f61f0915f579f37cf08041e0c25666ea46224e704144a375c11" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.843033 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-rr66z" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.902311 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.916348 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k"] Feb 19 19:51:41 crc kubenswrapper[4722]: E0219 19:51:41.916842 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812efe23-7ca7-49b9-bd76-194a82c603b3" containerName="ssh-known-hosts-edpm-deployment" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.916858 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="812efe23-7ca7-49b9-bd76-194a82c603b3" containerName="ssh-known-hosts-edpm-deployment" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.917054 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="812efe23-7ca7-49b9-bd76-194a82c603b3" containerName="ssh-known-hosts-edpm-deployment" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.917876 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.922983 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.923305 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.923454 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.923595 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.931290 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k"] Feb 19 19:51:41 crc kubenswrapper[4722]: I0219 19:51:41.982255 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"] Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.038426 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.038572 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: 
\"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.038681 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwq8z\" (UniqueName: \"kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.140992 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.141393 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwq8z\" (UniqueName: \"kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.141555 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.145228 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.150729 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.165240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwq8z\" (UniqueName: \"kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-r6c9k\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.254353 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.776767 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k"] Feb 19 19:51:42 crc kubenswrapper[4722]: I0219 19:51:42.853189 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" event={"ID":"44ab5cbe-e4cd-4036-8768-104fcf0d8963","Type":"ContainerStarted","Data":"48509c603cbc50d92636f3b8fd5508c2be236b4b60cc1033e3fe5f5f28886ca6"} Feb 19 19:51:43 crc kubenswrapper[4722]: I0219 19:51:43.861839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" event={"ID":"44ab5cbe-e4cd-4036-8768-104fcf0d8963","Type":"ContainerStarted","Data":"4a87c6050a72a549c80668d7d9b519552efbd08d93bc7244a2c305d766c13317"} Feb 19 19:51:43 crc kubenswrapper[4722]: I0219 19:51:43.862025 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p59r9" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="registry-server" containerID="cri-o://5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2" gracePeriod=2 Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.452840 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.596787 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities\") pod \"b2244865-e076-45b5-9bd9-d639d96d6ffe\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.596952 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content\") pod \"b2244865-e076-45b5-9bd9-d639d96d6ffe\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.597043 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2tt9\" (UniqueName: \"kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9\") pod \"b2244865-e076-45b5-9bd9-d639d96d6ffe\" (UID: \"b2244865-e076-45b5-9bd9-d639d96d6ffe\") " Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.597617 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities" (OuterVolumeSpecName: "utilities") pod "b2244865-e076-45b5-9bd9-d639d96d6ffe" (UID: "b2244865-e076-45b5-9bd9-d639d96d6ffe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.597985 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.602708 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9" (OuterVolumeSpecName: "kube-api-access-r2tt9") pod "b2244865-e076-45b5-9bd9-d639d96d6ffe" (UID: "b2244865-e076-45b5-9bd9-d639d96d6ffe"). InnerVolumeSpecName "kube-api-access-r2tt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.700334 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2tt9\" (UniqueName: \"kubernetes.io/projected/b2244865-e076-45b5-9bd9-d639d96d6ffe-kube-api-access-r2tt9\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.720142 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2244865-e076-45b5-9bd9-d639d96d6ffe" (UID: "b2244865-e076-45b5-9bd9-d639d96d6ffe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.802432 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2244865-e076-45b5-9bd9-d639d96d6ffe-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.875651 4722 generic.go:334] "Generic (PLEG): container finished" podID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerID="5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2" exitCode=0 Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.875849 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerDied","Data":"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2"} Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.875886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p59r9" event={"ID":"b2244865-e076-45b5-9bd9-d639d96d6ffe","Type":"ContainerDied","Data":"99ef2afc5d68e5988babf1bade224e9fb79a3934fe1cd919c9eb1f3329445aa3"} Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.875905 4722 scope.go:117] "RemoveContainer" containerID="5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.876054 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p59r9" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.900391 4722 scope.go:117] "RemoveContainer" containerID="b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.914145 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" podStartSLOduration=3.102753357 podStartE2EDuration="3.914111557s" podCreationTimestamp="2026-02-19 19:51:41 +0000 UTC" firstStartedPulling="2026-02-19 19:51:42.782746889 +0000 UTC m=+2002.395097213" lastFinishedPulling="2026-02-19 19:51:43.594105089 +0000 UTC m=+2003.206455413" observedRunningTime="2026-02-19 19:51:44.910219035 +0000 UTC m=+2004.522569399" watchObservedRunningTime="2026-02-19 19:51:44.914111557 +0000 UTC m=+2004.526461881" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.936459 4722 scope.go:117] "RemoveContainer" containerID="b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4" Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.941548 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"] Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.951561 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p59r9"] Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.978892 4722 scope.go:117] "RemoveContainer" containerID="5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2" Feb 19 19:51:44 crc kubenswrapper[4722]: E0219 19:51:44.979325 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2\": container with ID starting with 5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2 not found: ID does not exist" 
containerID="5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2"
Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.979364 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2"} err="failed to get container status \"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2\": rpc error: code = NotFound desc = could not find container \"5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2\": container with ID starting with 5610e76a13d8eb2abda855e27d31223636f9721cc64493b58192a4b7aab5d9e2 not found: ID does not exist"
Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.979392 4722 scope.go:117] "RemoveContainer" containerID="b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d"
Feb 19 19:51:44 crc kubenswrapper[4722]: E0219 19:51:44.979831 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d\": container with ID starting with b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d not found: ID does not exist" containerID="b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d"
Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.979880 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d"} err="failed to get container status \"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d\": rpc error: code = NotFound desc = could not find container \"b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d\": container with ID starting with b183e57fc8b9831e61b7316a96f9c0d086b2217a3e0b5872b56a64d22b29b86d not found: ID does not exist"
Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.979912 4722 scope.go:117] "RemoveContainer" containerID="b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4"
Feb 19 19:51:44 crc kubenswrapper[4722]: E0219 19:51:44.980202 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4\": container with ID starting with b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4 not found: ID does not exist" containerID="b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4"
Feb 19 19:51:44 crc kubenswrapper[4722]: I0219 19:51:44.980234 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4"} err="failed to get container status \"b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4\": rpc error: code = NotFound desc = could not find container \"b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4\": container with ID starting with b3baeed98ad3026543608486edfafe68a4f80a3a01617b285c6384e3222d1ee4 not found: ID does not exist"
Feb 19 19:51:45 crc kubenswrapper[4722]: I0219 19:51:45.083833 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" path="/var/lib/kubelet/pods/b2244865-e076-45b5-9bd9-d639d96d6ffe/volumes"
Feb 19 19:51:45 crc kubenswrapper[4722]: I0219 19:51:45.666683 4722 scope.go:117] "RemoveContainer" containerID="60ef90f5731ef12dc8b60fe6497ea601e5709b737be4d080a2debe1569284fd1"
Feb 19 19:51:45 crc kubenswrapper[4722]: I0219 19:51:45.708976 4722 scope.go:117] "RemoveContainer" containerID="b21bbae7a8949776700019b16cbaccbd427e9d7db723a1e91246a4178885c340"
Feb 19 19:51:48 crc kubenswrapper[4722]: I0219 19:51:48.057061 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2lhsl"]
Feb 19 19:51:48 crc kubenswrapper[4722]: I0219 19:51:48.070311 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-2lhsl"]
Feb 19 19:51:49 crc kubenswrapper[4722]: I0219 19:51:49.083648 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8" path="/var/lib/kubelet/pods/ab494dba-cc02-4e34-b4cc-0c5bd3fc7ee8/volumes"
Feb 19 19:51:50 crc kubenswrapper[4722]: I0219 19:51:50.947388 4722 generic.go:334] "Generic (PLEG): container finished" podID="44ab5cbe-e4cd-4036-8768-104fcf0d8963" containerID="4a87c6050a72a549c80668d7d9b519552efbd08d93bc7244a2c305d766c13317" exitCode=0
Feb 19 19:51:50 crc kubenswrapper[4722]: I0219 19:51:50.947523 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" event={"ID":"44ab5cbe-e4cd-4036-8768-104fcf0d8963","Type":"ContainerDied","Data":"4a87c6050a72a549c80668d7d9b519552efbd08d93bc7244a2c305d766c13317"}
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.429271 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k"
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.566701 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwq8z\" (UniqueName: \"kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z\") pod \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") "
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.566831 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam\") pod \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") "
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.566854 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory\") pod \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\" (UID: \"44ab5cbe-e4cd-4036-8768-104fcf0d8963\") "
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.572942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z" (OuterVolumeSpecName: "kube-api-access-fwq8z") pod "44ab5cbe-e4cd-4036-8768-104fcf0d8963" (UID: "44ab5cbe-e4cd-4036-8768-104fcf0d8963"). InnerVolumeSpecName "kube-api-access-fwq8z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.595658 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory" (OuterVolumeSpecName: "inventory") pod "44ab5cbe-e4cd-4036-8768-104fcf0d8963" (UID: "44ab5cbe-e4cd-4036-8768-104fcf0d8963"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.597529 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "44ab5cbe-e4cd-4036-8768-104fcf0d8963" (UID: "44ab5cbe-e4cd-4036-8768-104fcf0d8963"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.669648 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwq8z\" (UniqueName: \"kubernetes.io/projected/44ab5cbe-e4cd-4036-8768-104fcf0d8963-kube-api-access-fwq8z\") on node \"crc\" DevicePath \"\""
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.669688 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.669704 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44ab5cbe-e4cd-4036-8768-104fcf0d8963-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.968781 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k" event={"ID":"44ab5cbe-e4cd-4036-8768-104fcf0d8963","Type":"ContainerDied","Data":"48509c603cbc50d92636f3b8fd5508c2be236b4b60cc1033e3fe5f5f28886ca6"}
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.968834 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48509c603cbc50d92636f3b8fd5508c2be236b4b60cc1033e3fe5f5f28886ca6"
Feb 19 19:51:52 crc kubenswrapper[4722]: I0219 19:51:52.969134 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-r6c9k"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.044614 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"]
Feb 19 19:51:53 crc kubenswrapper[4722]: E0219 19:51:53.045820 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="extract-content"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.045911 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="extract-content"
Feb 19 19:51:53 crc kubenswrapper[4722]: E0219 19:51:53.045967 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44ab5cbe-e4cd-4036-8768-104fcf0d8963" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.046018 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="44ab5cbe-e4cd-4036-8768-104fcf0d8963" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:51:53 crc kubenswrapper[4722]: E0219 19:51:53.046100 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="extract-utilities"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.046217 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="extract-utilities"
Feb 19 19:51:53 crc kubenswrapper[4722]: E0219 19:51:53.046292 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="registry-server"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.046349 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="registry-server"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.046579 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="44ab5cbe-e4cd-4036-8768-104fcf0d8963" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.046650 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2244865-e076-45b5-9bd9-d639d96d6ffe" containerName="registry-server"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.047505 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.053386 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"]
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.089801 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.090031 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.090186 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.090345 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.188783 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.188860 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.188922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m79fx\" (UniqueName: \"kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.291642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.291728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.291797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m79fx\" (UniqueName: \"kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.295917 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.298998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.309365 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m79fx\" (UniqueName: \"kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.423560 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.986576 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"]
Feb 19 19:51:53 crc kubenswrapper[4722]: I0219 19:51:53.989673 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 19 19:51:54 crc kubenswrapper[4722]: I0219 19:51:54.993261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" event={"ID":"baff33d3-a587-4283-a861-38d88a47539e","Type":"ContainerStarted","Data":"143b81501dc786d37c3fad45b6cc39d9b601b99871fd6c5f89b351f716bba996"}
Feb 19 19:51:54 crc kubenswrapper[4722]: I0219 19:51:54.993792 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" event={"ID":"baff33d3-a587-4283-a861-38d88a47539e","Type":"ContainerStarted","Data":"3b4cf662cf6ba5a00d111cabca8e0756fd25592ca41c1262d13ee098f3a88482"}
Feb 19 19:52:04 crc kubenswrapper[4722]: I0219 19:52:04.090201 4722 generic.go:334] "Generic (PLEG): container finished" podID="baff33d3-a587-4283-a861-38d88a47539e" containerID="143b81501dc786d37c3fad45b6cc39d9b601b99871fd6c5f89b351f716bba996" exitCode=0
Feb 19 19:52:04 crc kubenswrapper[4722]: I0219 19:52:04.090458 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" event={"ID":"baff33d3-a587-4283-a861-38d88a47539e","Type":"ContainerDied","Data":"143b81501dc786d37c3fad45b6cc39d9b601b99871fd6c5f89b351f716bba996"}
Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.569069 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.660834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam\") pod \"baff33d3-a587-4283-a861-38d88a47539e\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") "
Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.660933 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory\") pod \"baff33d3-a587-4283-a861-38d88a47539e\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") "
Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.661201 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m79fx\" (UniqueName: \"kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx\") pod \"baff33d3-a587-4283-a861-38d88a47539e\" (UID: \"baff33d3-a587-4283-a861-38d88a47539e\") "
Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.669102 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx" (OuterVolumeSpecName: "kube-api-access-m79fx") pod "baff33d3-a587-4283-a861-38d88a47539e" (UID: "baff33d3-a587-4283-a861-38d88a47539e"). InnerVolumeSpecName "kube-api-access-m79fx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.692607 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory" (OuterVolumeSpecName: "inventory") pod "baff33d3-a587-4283-a861-38d88a47539e" (UID: "baff33d3-a587-4283-a861-38d88a47539e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.698326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "baff33d3-a587-4283-a861-38d88a47539e" (UID: "baff33d3-a587-4283-a861-38d88a47539e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.763079 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m79fx\" (UniqueName: \"kubernetes.io/projected/baff33d3-a587-4283-a861-38d88a47539e-kube-api-access-m79fx\") on node \"crc\" DevicePath \"\""
Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.763118 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 19:52:05 crc kubenswrapper[4722]: I0219 19:52:05.763157 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baff33d3-a587-4283-a861-38d88a47539e-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.114423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6" event={"ID":"baff33d3-a587-4283-a861-38d88a47539e","Type":"ContainerDied","Data":"3b4cf662cf6ba5a00d111cabca8e0756fd25592ca41c1262d13ee098f3a88482"}
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.114460 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b4cf662cf6ba5a00d111cabca8e0756fd25592ca41c1262d13ee098f3a88482"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.114485 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.241935 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"]
Feb 19 19:52:06 crc kubenswrapper[4722]: E0219 19:52:06.242441 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baff33d3-a587-4283-a861-38d88a47539e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.242459 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="baff33d3-a587-4283-a861-38d88a47539e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.242697 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="baff33d3-a587-4283-a861-38d88a47539e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.243507 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.255138 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"]
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301241 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301512 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301924 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.301952 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.302189 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.302439 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405342 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405503 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405857 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.405943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406131 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406197 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406222 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406302 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406343 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8x2\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406454 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.406536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508799 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508856 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508954 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x8x2\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.508997 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.509040 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.509115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.509138 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.509362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.510662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.510731 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.514848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.514887 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.515041 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.515600 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.516212 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"
Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.516548 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.516829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.517241 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.517909 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.527040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.527957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.530330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.531989 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.535433 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x8x2\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6\" (UID: 
\"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:06 crc kubenswrapper[4722]: I0219 19:52:06.619267 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:07 crc kubenswrapper[4722]: I0219 19:52:07.149574 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6"] Feb 19 19:52:08 crc kubenswrapper[4722]: I0219 19:52:08.138369 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" event={"ID":"0a95e206-d7b9-49a5-8efd-7cab72e48d9d","Type":"ContainerStarted","Data":"321fbe59a82c15c45c65820fb8c050c7d091a8fa12ab01541eab805404a4639d"} Feb 19 19:52:08 crc kubenswrapper[4722]: I0219 19:52:08.138951 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" event={"ID":"0a95e206-d7b9-49a5-8efd-7cab72e48d9d","Type":"ContainerStarted","Data":"dc0934b6dce273124dc8075661c982c60a4cfbee12a1f48ca3c5fd27bd9ba327"} Feb 19 19:52:08 crc kubenswrapper[4722]: I0219 19:52:08.165486 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" podStartSLOduration=1.7751663039999999 podStartE2EDuration="2.165465618s" podCreationTimestamp="2026-02-19 19:52:06 +0000 UTC" firstStartedPulling="2026-02-19 19:52:07.163298312 +0000 UTC m=+2026.775648636" lastFinishedPulling="2026-02-19 19:52:07.553597626 +0000 UTC m=+2027.165947950" observedRunningTime="2026-02-19 19:52:08.157446308 +0000 UTC m=+2027.769796642" watchObservedRunningTime="2026-02-19 19:52:08.165465618 +0000 UTC m=+2027.777815942" Feb 19 19:52:33 crc kubenswrapper[4722]: I0219 19:52:33.473139 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cloudkitty-db-sync-rwjf7"] Feb 19 19:52:33 crc kubenswrapper[4722]: I0219 19:52:33.481084 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-rwjf7"] Feb 19 19:52:35 crc kubenswrapper[4722]: I0219 19:52:35.089269 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb4da3f-b07b-4b6f-a524-8b2af229ed87" path="/var/lib/kubelet/pods/3eb4da3f-b07b-4b6f-a524-8b2af229ed87/volumes" Feb 19 19:52:41 crc kubenswrapper[4722]: I0219 19:52:41.036017 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-77rmn"] Feb 19 19:52:41 crc kubenswrapper[4722]: I0219 19:52:41.046465 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-77rmn"] Feb 19 19:52:41 crc kubenswrapper[4722]: I0219 19:52:41.091581 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e19f64-06d2-4c0e-b33c-000fea5deb27" path="/var/lib/kubelet/pods/04e19f64-06d2-4c0e-b33c-000fea5deb27/volumes" Feb 19 19:52:41 crc kubenswrapper[4722]: I0219 19:52:41.499346 4722 generic.go:334] "Generic (PLEG): container finished" podID="0a95e206-d7b9-49a5-8efd-7cab72e48d9d" containerID="321fbe59a82c15c45c65820fb8c050c7d091a8fa12ab01541eab805404a4639d" exitCode=0 Feb 19 19:52:41 crc kubenswrapper[4722]: I0219 19:52:41.499387 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" event={"ID":"0a95e206-d7b9-49a5-8efd-7cab72e48d9d","Type":"ContainerDied","Data":"321fbe59a82c15c45c65820fb8c050c7d091a8fa12ab01541eab805404a4639d"} Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.010709 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025602 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025635 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025674 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025707 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025759 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: 
\"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025812 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025837 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x8x2\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.025940 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 
19:52:43.025987 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.026012 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.026049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.026071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\" (UID: \"0a95e206-d7b9-49a5-8efd-7cab72e48d9d\") " Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.033246 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.033318 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.033417 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2" (OuterVolumeSpecName: "kube-api-access-8x8x2") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "kube-api-access-8x8x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.034341 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.038408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.039092 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.039621 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.039627 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.039959 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.040285 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.041050 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.046265 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.074706 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.094188 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory" (OuterVolumeSpecName: "inventory") pod "0a95e206-d7b9-49a5-8efd-7cab72e48d9d" (UID: "0a95e206-d7b9-49a5-8efd-7cab72e48d9d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129007 4722 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129039 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x8x2\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-kube-api-access-8x8x2\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129049 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129059 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129069 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: 
I0219 19:52:43.129079 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129088 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129098 4722 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129111 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129123 4722 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129135 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129167 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129181 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.129194 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a95e206-d7b9-49a5-8efd-7cab72e48d9d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.546395 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" event={"ID":"0a95e206-d7b9-49a5-8efd-7cab72e48d9d","Type":"ContainerDied","Data":"dc0934b6dce273124dc8075661c982c60a4cfbee12a1f48ca3c5fd27bd9ba327"} Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.546437 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc0934b6dce273124dc8075661c982c60a4cfbee12a1f48ca3c5fd27bd9ba327" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.546559 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.619899 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89"] Feb 19 19:52:43 crc kubenswrapper[4722]: E0219 19:52:43.620496 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a95e206-d7b9-49a5-8efd-7cab72e48d9d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.620525 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a95e206-d7b9-49a5-8efd-7cab72e48d9d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.620853 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a95e206-d7b9-49a5-8efd-7cab72e48d9d" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.621837 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.625446 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.625748 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.625913 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.626143 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.627424 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.636203 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89"] Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.643775 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.643818 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.643879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.643982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9h49\" (UniqueName: \"kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.644061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.746435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.746507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.746591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.746726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9h49\" (UniqueName: \"kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.746825 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.747666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.750300 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.752580 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.757755 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.767099 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9h49\" (UniqueName: \"kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-v5f89\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:43 crc kubenswrapper[4722]: I0219 19:52:43.946659 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:52:44 crc kubenswrapper[4722]: I0219 19:52:44.491370 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89"] Feb 19 19:52:44 crc kubenswrapper[4722]: I0219 19:52:44.571980 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" event={"ID":"9ff9829f-e8f9-4d78-9826-0385817cf2a4","Type":"ContainerStarted","Data":"751f892a57debc49c4d931416540d4a8f9ab6159fbeea6888e820b630b8b4812"} Feb 19 19:52:45 crc kubenswrapper[4722]: I0219 19:52:45.582073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" event={"ID":"9ff9829f-e8f9-4d78-9826-0385817cf2a4","Type":"ContainerStarted","Data":"456f8778f9036ba7b994c1e10b0831dd76f418dfe38b1485979e5bcdaf170ca1"} Feb 19 19:52:45 crc kubenswrapper[4722]: I0219 19:52:45.603014 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" podStartSLOduration=2.151882266 podStartE2EDuration="2.602992059s" podCreationTimestamp="2026-02-19 19:52:43 +0000 UTC" firstStartedPulling="2026-02-19 19:52:44.499611043 +0000 UTC m=+2064.111961367" lastFinishedPulling="2026-02-19 19:52:44.950720836 +0000 UTC m=+2064.563071160" observedRunningTime="2026-02-19 19:52:45.59853863 +0000 UTC m=+2065.210888974" watchObservedRunningTime="2026-02-19 19:52:45.602992059 +0000 UTC m=+2065.215342383" Feb 19 19:52:45 crc kubenswrapper[4722]: I0219 19:52:45.814379 4722 scope.go:117] "RemoveContainer" containerID="b01a64e732c528d91886fbc6303b22a5ddc8e59f3b32f31c6e2bfee4be333b08" Feb 19 19:52:45 crc kubenswrapper[4722]: I0219 19:52:45.848682 4722 scope.go:117] "RemoveContainer" containerID="6218fa1a82ddfb695bd10c74992d9f549d06629abf7b79a733758e50532f43fb" Feb 19 19:52:45 crc kubenswrapper[4722]: I0219 19:52:45.894072 4722 
scope.go:117] "RemoveContainer" containerID="1b22b481f4ab7fb0f4e181aec8382fbfd29168cb889f08d4a7d81841adae3d63" Feb 19 19:53:41 crc kubenswrapper[4722]: I0219 19:53:41.798785 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:53:41 crc kubenswrapper[4722]: I0219 19:53:41.800298 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:53:42 crc kubenswrapper[4722]: I0219 19:53:42.092323 4722 generic.go:334] "Generic (PLEG): container finished" podID="9ff9829f-e8f9-4d78-9826-0385817cf2a4" containerID="456f8778f9036ba7b994c1e10b0831dd76f418dfe38b1485979e5bcdaf170ca1" exitCode=0 Feb 19 19:53:42 crc kubenswrapper[4722]: I0219 19:53:42.092369 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" event={"ID":"9ff9829f-e8f9-4d78-9826-0385817cf2a4","Type":"ContainerDied","Data":"456f8778f9036ba7b994c1e10b0831dd76f418dfe38b1485979e5bcdaf170ca1"} Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.622408 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.755570 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory\") pod \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.755638 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0\") pod \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.755675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9h49\" (UniqueName: \"kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49\") pod \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.755700 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle\") pod \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.755964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam\") pod \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\" (UID: \"9ff9829f-e8f9-4d78-9826-0385817cf2a4\") " Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.769579 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9ff9829f-e8f9-4d78-9826-0385817cf2a4" (UID: "9ff9829f-e8f9-4d78-9826-0385817cf2a4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.778396 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49" (OuterVolumeSpecName: "kube-api-access-h9h49") pod "9ff9829f-e8f9-4d78-9826-0385817cf2a4" (UID: "9ff9829f-e8f9-4d78-9826-0385817cf2a4"). InnerVolumeSpecName "kube-api-access-h9h49". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.804780 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "9ff9829f-e8f9-4d78-9826-0385817cf2a4" (UID: "9ff9829f-e8f9-4d78-9826-0385817cf2a4"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.843346 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory" (OuterVolumeSpecName: "inventory") pod "9ff9829f-e8f9-4d78-9826-0385817cf2a4" (UID: "9ff9829f-e8f9-4d78-9826-0385817cf2a4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.864662 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.864694 4722 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.864705 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9h49\" (UniqueName: \"kubernetes.io/projected/9ff9829f-e8f9-4d78-9826-0385817cf2a4-kube-api-access-h9h49\") on node \"crc\" DevicePath \"\"" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.864714 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.899321 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9ff9829f-e8f9-4d78-9826-0385817cf2a4" (UID: "9ff9829f-e8f9-4d78-9826-0385817cf2a4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:53:43 crc kubenswrapper[4722]: I0219 19:53:43.966013 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9ff9829f-e8f9-4d78-9826-0385817cf2a4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.110191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" event={"ID":"9ff9829f-e8f9-4d78-9826-0385817cf2a4","Type":"ContainerDied","Data":"751f892a57debc49c4d931416540d4a8f9ab6159fbeea6888e820b630b8b4812"} Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.110231 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="751f892a57debc49c4d931416540d4a8f9ab6159fbeea6888e820b630b8b4812" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.110281 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-v5f89" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.215210 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf"] Feb 19 19:53:44 crc kubenswrapper[4722]: E0219 19:53:44.216076 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff9829f-e8f9-4d78-9826-0385817cf2a4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.216104 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff9829f-e8f9-4d78-9826-0385817cf2a4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.216392 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff9829f-e8f9-4d78-9826-0385817cf2a4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.217396 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.225676 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.225924 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.225757 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.226236 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.226416 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.226426 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.230826 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf"] Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375502 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375559 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375625 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375720 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxcnf\" (UniqueName: \"kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.375798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.477612 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.478599 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxcnf\" (UniqueName: \"kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.478709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.478878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.478902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.478947 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.482261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.482664 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.483069 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.483265 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.483391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.496406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxcnf\" (UniqueName: \"kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:44 crc kubenswrapper[4722]: I0219 19:53:44.538231 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:53:45 crc kubenswrapper[4722]: I0219 19:53:45.119566 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf"] Feb 19 19:53:46 crc kubenswrapper[4722]: I0219 19:53:46.135525 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" event={"ID":"ee896205-7724-47fe-9f87-f2efb9afa870","Type":"ContainerStarted","Data":"42128f3069e7caf833eb9af4ff7adba03d900487ba12c81ba237fa92fcdff17c"} Feb 19 19:53:46 crc kubenswrapper[4722]: I0219 19:53:46.136095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" event={"ID":"ee896205-7724-47fe-9f87-f2efb9afa870","Type":"ContainerStarted","Data":"1e8b756a37868b59c74f59e4ae72a0d353d8102de72325a1db36c98c4ee3665a"} Feb 19 19:54:11 crc kubenswrapper[4722]: I0219 19:54:11.798633 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:54:11 crc kubenswrapper[4722]: I0219 19:54:11.799396 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:54:29 crc kubenswrapper[4722]: I0219 19:54:29.497108 4722 
generic.go:334] "Generic (PLEG): container finished" podID="ee896205-7724-47fe-9f87-f2efb9afa870" containerID="42128f3069e7caf833eb9af4ff7adba03d900487ba12c81ba237fa92fcdff17c" exitCode=0 Feb 19 19:54:29 crc kubenswrapper[4722]: I0219 19:54:29.497275 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" event={"ID":"ee896205-7724-47fe-9f87-f2efb9afa870","Type":"ContainerDied","Data":"42128f3069e7caf833eb9af4ff7adba03d900487ba12c81ba237fa92fcdff17c"} Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.110945 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.236615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.237201 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxcnf\" (UniqueName: \"kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.237308 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.237531 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.237693 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.237784 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam\") pod \"ee896205-7724-47fe-9f87-f2efb9afa870\" (UID: \"ee896205-7724-47fe-9f87-f2efb9afa870\") " Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.245292 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf" (OuterVolumeSpecName: "kube-api-access-kxcnf") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "kube-api-access-kxcnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.256345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.271176 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.273369 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.273398 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.273788 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory" (OuterVolumeSpecName: "inventory") pod "ee896205-7724-47fe-9f87-f2efb9afa870" (UID: "ee896205-7724-47fe-9f87-f2efb9afa870"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340798 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340831 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxcnf\" (UniqueName: \"kubernetes.io/projected/ee896205-7724-47fe-9f87-f2efb9afa870-kube-api-access-kxcnf\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340844 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340855 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340867 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.340876 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee896205-7724-47fe-9f87-f2efb9afa870-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.520001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" event={"ID":"ee896205-7724-47fe-9f87-f2efb9afa870","Type":"ContainerDied","Data":"1e8b756a37868b59c74f59e4ae72a0d353d8102de72325a1db36c98c4ee3665a"} Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.520040 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e8b756a37868b59c74f59e4ae72a0d353d8102de72325a1db36c98c4ee3665a" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.520134 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.622070 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2"] Feb 19 19:54:31 crc kubenswrapper[4722]: E0219 19:54:31.622763 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee896205-7724-47fe-9f87-f2efb9afa870" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.622841 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee896205-7724-47fe-9f87-f2efb9afa870" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.623249 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee896205-7724-47fe-9f87-f2efb9afa870" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.624662 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.630119 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.630213 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.630588 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.630729 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.630926 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.632388 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2"] Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.748369 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.748527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: 
\"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.748554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.748832 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjpjx\" (UniqueName: \"kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.748908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.850656 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.850707 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.850840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjpjx\" (UniqueName: \"kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.850880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.850965 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.854794 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" 
(UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.855323 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.856482 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.857600 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.868757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjpjx\" (UniqueName: \"kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:31 crc kubenswrapper[4722]: I0219 19:54:31.946254 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:54:32 crc kubenswrapper[4722]: I0219 19:54:32.459123 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2"] Feb 19 19:54:32 crc kubenswrapper[4722]: I0219 19:54:32.529774 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" event={"ID":"a0d75723-6d9a-4609-a294-f179d1e84710","Type":"ContainerStarted","Data":"17f45e35533bb8a07ee9122a5653857a6db08cf7018f843e7d190f9e046c6b5c"} Feb 19 19:54:33 crc kubenswrapper[4722]: I0219 19:54:33.539852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" event={"ID":"a0d75723-6d9a-4609-a294-f179d1e84710","Type":"ContainerStarted","Data":"6e06dc95162c0d51603edd10c6b9f7656cb9b02520ae430117bbe54d6a6625f4"} Feb 19 19:54:33 crc kubenswrapper[4722]: I0219 19:54:33.561937 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" podStartSLOduration=2.167612419 podStartE2EDuration="2.561920541s" podCreationTimestamp="2026-02-19 19:54:31 +0000 UTC" firstStartedPulling="2026-02-19 19:54:32.465602093 +0000 UTC m=+2172.077952417" lastFinishedPulling="2026-02-19 19:54:32.859910215 +0000 UTC m=+2172.472260539" observedRunningTime="2026-02-19 19:54:33.553496388 +0000 UTC m=+2173.165846732" watchObservedRunningTime="2026-02-19 19:54:33.561920541 +0000 UTC m=+2173.174270865" Feb 19 19:54:41 crc kubenswrapper[4722]: I0219 19:54:41.798268 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:54:41 crc kubenswrapper[4722]: I0219 
19:54:41.798847 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:54:41 crc kubenswrapper[4722]: I0219 19:54:41.798891 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:54:41 crc kubenswrapper[4722]: I0219 19:54:41.799720 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:54:41 crc kubenswrapper[4722]: I0219 19:54:41.799774 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef" gracePeriod=600 Feb 19 19:54:42 crc kubenswrapper[4722]: I0219 19:54:42.626389 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef" exitCode=0 Feb 19 19:54:42 crc kubenswrapper[4722]: I0219 19:54:42.626466 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef"} Feb 19 19:54:42 crc 
kubenswrapper[4722]: I0219 19:54:42.626694 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209"} Feb 19 19:54:42 crc kubenswrapper[4722]: I0219 19:54:42.626715 4722 scope.go:117] "RemoveContainer" containerID="38e8991442f67bf67b1efe077d8883949da08ed6d43a8a72df99dae5eb3100bc" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.267742 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.270901 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.287806 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.403591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.403727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.403757 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-l6kxm\" (UniqueName: \"kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.505630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.505773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.505800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6kxm\" (UniqueName: \"kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.506229 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.506352 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.534094 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6kxm\" (UniqueName: \"kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm\") pod \"community-operators-pmrr9\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:24 crc kubenswrapper[4722]: I0219 19:55:24.594526 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:25 crc kubenswrapper[4722]: I0219 19:55:25.150488 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:26 crc kubenswrapper[4722]: I0219 19:55:26.016880 4722 generic.go:334] "Generic (PLEG): container finished" podID="7b93b205-cb18-4cb4-810c-17775c15279e" containerID="26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6" exitCode=0 Feb 19 19:55:26 crc kubenswrapper[4722]: I0219 19:55:26.016935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerDied","Data":"26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6"} Feb 19 19:55:26 crc kubenswrapper[4722]: I0219 19:55:26.017241 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerStarted","Data":"7fd6a711f28aeb9304a059eacf02cde35648f02d7504e83c7695fce054b0c1b6"} Feb 19 19:55:28 crc kubenswrapper[4722]: I0219 19:55:28.036408 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerStarted","Data":"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c"} Feb 19 19:55:30 crc kubenswrapper[4722]: I0219 19:55:30.056861 4722 generic.go:334] "Generic (PLEG): container finished" podID="7b93b205-cb18-4cb4-810c-17775c15279e" containerID="36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c" exitCode=0 Feb 19 19:55:30 crc kubenswrapper[4722]: I0219 19:55:30.056943 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerDied","Data":"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c"} Feb 19 19:55:31 crc kubenswrapper[4722]: I0219 19:55:31.083072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerStarted","Data":"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf"} Feb 19 19:55:31 crc kubenswrapper[4722]: I0219 19:55:31.102781 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pmrr9" podStartSLOduration=2.69652731 podStartE2EDuration="7.102764957s" podCreationTimestamp="2026-02-19 19:55:24 +0000 UTC" firstStartedPulling="2026-02-19 19:55:26.020446561 +0000 UTC m=+2225.632796895" lastFinishedPulling="2026-02-19 19:55:30.426684218 +0000 UTC m=+2230.039034542" observedRunningTime="2026-02-19 19:55:31.099730732 +0000 UTC m=+2230.712081056" watchObservedRunningTime="2026-02-19 19:55:31.102764957 +0000 UTC m=+2230.715115281" Feb 19 19:55:34 crc kubenswrapper[4722]: I0219 19:55:34.595344 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:34 crc 
kubenswrapper[4722]: I0219 19:55:34.597133 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:34 crc kubenswrapper[4722]: I0219 19:55:34.648742 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:35 crc kubenswrapper[4722]: I0219 19:55:35.161739 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:36 crc kubenswrapper[4722]: I0219 19:55:36.654715 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.142346 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pmrr9" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="registry-server" containerID="cri-o://a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf" gracePeriod=2 Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.687840 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.733005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content\") pod \"7b93b205-cb18-4cb4-810c-17775c15279e\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.786205 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b93b205-cb18-4cb4-810c-17775c15279e" (UID: "7b93b205-cb18-4cb4-810c-17775c15279e"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.835668 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities\") pod \"7b93b205-cb18-4cb4-810c-17775c15279e\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.836022 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6kxm\" (UniqueName: \"kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm\") pod \"7b93b205-cb18-4cb4-810c-17775c15279e\" (UID: \"7b93b205-cb18-4cb4-810c-17775c15279e\") " Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.836512 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.836623 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities" (OuterVolumeSpecName: "utilities") pod "7b93b205-cb18-4cb4-810c-17775c15279e" (UID: "7b93b205-cb18-4cb4-810c-17775c15279e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.840913 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm" (OuterVolumeSpecName: "kube-api-access-l6kxm") pod "7b93b205-cb18-4cb4-810c-17775c15279e" (UID: "7b93b205-cb18-4cb4-810c-17775c15279e"). InnerVolumeSpecName "kube-api-access-l6kxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.939409 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6kxm\" (UniqueName: \"kubernetes.io/projected/7b93b205-cb18-4cb4-810c-17775c15279e-kube-api-access-l6kxm\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:38 crc kubenswrapper[4722]: I0219 19:55:38.939689 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b93b205-cb18-4cb4-810c-17775c15279e-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.154417 4722 generic.go:334] "Generic (PLEG): container finished" podID="7b93b205-cb18-4cb4-810c-17775c15279e" containerID="a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf" exitCode=0 Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.154826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerDied","Data":"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf"} Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.154864 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pmrr9" event={"ID":"7b93b205-cb18-4cb4-810c-17775c15279e","Type":"ContainerDied","Data":"7fd6a711f28aeb9304a059eacf02cde35648f02d7504e83c7695fce054b0c1b6"} Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.154900 4722 scope.go:117] "RemoveContainer" containerID="a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.155073 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pmrr9" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.183931 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.184482 4722 scope.go:117] "RemoveContainer" containerID="36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.197284 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pmrr9"] Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.203285 4722 scope.go:117] "RemoveContainer" containerID="26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.263791 4722 scope.go:117] "RemoveContainer" containerID="a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf" Feb 19 19:55:39 crc kubenswrapper[4722]: E0219 19:55:39.264355 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf\": container with ID starting with a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf not found: ID does not exist" containerID="a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.264395 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf"} err="failed to get container status \"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf\": rpc error: code = NotFound desc = could not find container \"a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf\": container with ID starting with a02bd41406d31904f6b73987db767c17e5221348805bec90cf294b0bcc0f36bf not 
found: ID does not exist" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.264422 4722 scope.go:117] "RemoveContainer" containerID="36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c" Feb 19 19:55:39 crc kubenswrapper[4722]: E0219 19:55:39.264862 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c\": container with ID starting with 36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c not found: ID does not exist" containerID="36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.264884 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c"} err="failed to get container status \"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c\": rpc error: code = NotFound desc = could not find container \"36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c\": container with ID starting with 36af8f6b718f14c99a6c7612e886b865f0dc7b91beaf5949b40b17093a71593c not found: ID does not exist" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.264896 4722 scope.go:117] "RemoveContainer" containerID="26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6" Feb 19 19:55:39 crc kubenswrapper[4722]: E0219 19:55:39.265251 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6\": container with ID starting with 26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6 not found: ID does not exist" containerID="26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6" Feb 19 19:55:39 crc kubenswrapper[4722]: I0219 19:55:39.265301 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6"} err="failed to get container status \"26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6\": rpc error: code = NotFound desc = could not find container \"26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6\": container with ID starting with 26c3160c1ca66b1f21e41185ca604954483a1de8836521ec2390071be12995f6 not found: ID does not exist" Feb 19 19:55:41 crc kubenswrapper[4722]: I0219 19:55:41.084587 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" path="/var/lib/kubelet/pods/7b93b205-cb18-4cb4-810c-17775c15279e/volumes" Feb 19 19:57:11 crc kubenswrapper[4722]: I0219 19:57:11.798467 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:57:11 crc kubenswrapper[4722]: I0219 19:57:11.799100 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:57:41 crc kubenswrapper[4722]: I0219 19:57:41.798670 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:57:41 crc kubenswrapper[4722]: I0219 19:57:41.799236 4722 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:57:59 crc kubenswrapper[4722]: I0219 19:57:59.479502 4722 generic.go:334] "Generic (PLEG): container finished" podID="a0d75723-6d9a-4609-a294-f179d1e84710" containerID="6e06dc95162c0d51603edd10c6b9f7656cb9b02520ae430117bbe54d6a6625f4" exitCode=0 Feb 19 19:57:59 crc kubenswrapper[4722]: I0219 19:57:59.479626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" event={"ID":"a0d75723-6d9a-4609-a294-f179d1e84710","Type":"ContainerDied","Data":"6e06dc95162c0d51603edd10c6b9f7656cb9b02520ae430117bbe54d6a6625f4"} Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.034190 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.206333 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle\") pod \"a0d75723-6d9a-4609-a294-f179d1e84710\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.206758 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam\") pod \"a0d75723-6d9a-4609-a294-f179d1e84710\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.206904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory\") pod \"a0d75723-6d9a-4609-a294-f179d1e84710\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.207044 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjpjx\" (UniqueName: \"kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx\") pod \"a0d75723-6d9a-4609-a294-f179d1e84710\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.207299 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0\") pod \"a0d75723-6d9a-4609-a294-f179d1e84710\" (UID: \"a0d75723-6d9a-4609-a294-f179d1e84710\") " Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.220352 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a0d75723-6d9a-4609-a294-f179d1e84710" (UID: "a0d75723-6d9a-4609-a294-f179d1e84710"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.220480 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx" (OuterVolumeSpecName: "kube-api-access-jjpjx") pod "a0d75723-6d9a-4609-a294-f179d1e84710" (UID: "a0d75723-6d9a-4609-a294-f179d1e84710"). InnerVolumeSpecName "kube-api-access-jjpjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.235669 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a0d75723-6d9a-4609-a294-f179d1e84710" (UID: "a0d75723-6d9a-4609-a294-f179d1e84710"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.238248 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a0d75723-6d9a-4609-a294-f179d1e84710" (UID: "a0d75723-6d9a-4609-a294-f179d1e84710"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.239844 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory" (OuterVolumeSpecName: "inventory") pod "a0d75723-6d9a-4609-a294-f179d1e84710" (UID: "a0d75723-6d9a-4609-a294-f179d1e84710"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.310656 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.310694 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.310703 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjpjx\" (UniqueName: \"kubernetes.io/projected/a0d75723-6d9a-4609-a294-f179d1e84710-kube-api-access-jjpjx\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.310712 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.310720 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0d75723-6d9a-4609-a294-f179d1e84710-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.506223 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" event={"ID":"a0d75723-6d9a-4609-a294-f179d1e84710","Type":"ContainerDied","Data":"17f45e35533bb8a07ee9122a5653857a6db08cf7018f843e7d190f9e046c6b5c"} Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.506274 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17f45e35533bb8a07ee9122a5653857a6db08cf7018f843e7d190f9e046c6b5c" Feb 19 
19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.506321 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.620532 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2"] Feb 19 19:58:01 crc kubenswrapper[4722]: E0219 19:58:01.621002 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="registry-server" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621017 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="registry-server" Feb 19 19:58:01 crc kubenswrapper[4722]: E0219 19:58:01.621046 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="extract-content" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621055 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="extract-content" Feb 19 19:58:01 crc kubenswrapper[4722]: E0219 19:58:01.621077 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d75723-6d9a-4609-a294-f179d1e84710" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621087 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d75723-6d9a-4609-a294-f179d1e84710" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 19:58:01 crc kubenswrapper[4722]: E0219 19:58:01.621105 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="extract-utilities" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621114 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" 
containerName="extract-utilities" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621376 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d75723-6d9a-4609-a294-f179d1e84710" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.621403 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b205-cb18-4cb4-810c-17775c15279e" containerName="registry-server" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.622296 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.625340 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.625500 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.625628 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.625746 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.625880 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.628354 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.628487 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.636316 4722 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2"] Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.724687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2sz7\" (UniqueName: \"kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.724780 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.724864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.724901 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.724947 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725093 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725136 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725414 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725462 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.725511 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827330 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: 
\"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827373 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827409 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827429 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827576 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.827629 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2sz7\" (UniqueName: \"kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: 
\"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.828677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.831438 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.832174 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.832402 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.832496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.832884 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.833123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.833595 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.834043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.835067 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.843569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2sz7\" (UniqueName: \"kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7\") pod \"nova-edpm-deployment-openstack-edpm-ipam-cdks2\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:01 crc kubenswrapper[4722]: I0219 19:58:01.955618 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 19:58:02 crc kubenswrapper[4722]: I0219 19:58:02.492329 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2"] Feb 19 19:58:02 crc kubenswrapper[4722]: I0219 19:58:02.499394 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 19:58:02 crc kubenswrapper[4722]: I0219 19:58:02.518556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" event={"ID":"67f05b1f-f720-4b77-967c-2649fd05cb09","Type":"ContainerStarted","Data":"3c98bf09b5168c37261366bfdf36c471ccfc735e44cc5f381dee883dd636c28f"} Feb 19 19:58:03 crc kubenswrapper[4722]: I0219 19:58:03.530873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" event={"ID":"67f05b1f-f720-4b77-967c-2649fd05cb09","Type":"ContainerStarted","Data":"15f3bf7b4aaeb40fcc3e8b1c6a8270cdc8388a64a0038be83798c738a35d98e7"} Feb 19 19:58:03 crc kubenswrapper[4722]: I0219 19:58:03.555635 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" podStartSLOduration=2.002208127 podStartE2EDuration="2.555614234s" podCreationTimestamp="2026-02-19 19:58:01 +0000 UTC" firstStartedPulling="2026-02-19 19:58:02.499135057 +0000 UTC m=+2382.111485381" lastFinishedPulling="2026-02-19 19:58:03.052541164 +0000 UTC m=+2382.664891488" observedRunningTime="2026-02-19 19:58:03.547764869 +0000 UTC m=+2383.160115193" watchObservedRunningTime="2026-02-19 19:58:03.555614234 +0000 UTC m=+2383.167964558" Feb 19 19:58:11 crc kubenswrapper[4722]: I0219 19:58:11.798957 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 19:58:11 crc kubenswrapper[4722]: I0219 19:58:11.799602 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 19:58:11 crc kubenswrapper[4722]: I0219 19:58:11.799655 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 19:58:11 crc kubenswrapper[4722]: I0219 19:58:11.800609 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 19:58:11 crc kubenswrapper[4722]: I0219 19:58:11.800682 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" gracePeriod=600 Feb 19 19:58:11 crc kubenswrapper[4722]: E0219 19:58:11.931873 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:58:12 crc kubenswrapper[4722]: I0219 19:58:12.650489 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" exitCode=0 Feb 19 19:58:12 crc kubenswrapper[4722]: I0219 19:58:12.650572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209"} Feb 19 19:58:12 crc kubenswrapper[4722]: I0219 19:58:12.650858 4722 scope.go:117] "RemoveContainer" containerID="2cd16aaeb87a475b93fc788beab87d96c07079ee8a02a2b8bfaa32d70b168fef" Feb 19 19:58:12 crc kubenswrapper[4722]: I0219 19:58:12.651692 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:58:12 crc kubenswrapper[4722]: E0219 19:58:12.652086 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:58:25 crc kubenswrapper[4722]: I0219 19:58:25.071261 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:58:25 crc kubenswrapper[4722]: E0219 19:58:25.072039 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:58:38 crc kubenswrapper[4722]: I0219 19:58:38.071857 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:58:38 crc kubenswrapper[4722]: E0219 19:58:38.072769 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:58:50 crc kubenswrapper[4722]: I0219 19:58:50.071643 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:58:50 crc kubenswrapper[4722]: E0219 19:58:50.073117 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:59:05 crc kubenswrapper[4722]: I0219 19:59:05.071544 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:59:05 crc kubenswrapper[4722]: E0219 19:59:05.073180 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:59:16 crc kubenswrapper[4722]: I0219 19:59:16.071697 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:59:16 crc kubenswrapper[4722]: E0219 19:59:16.072534 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:59:29 crc kubenswrapper[4722]: I0219 19:59:29.071644 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:59:29 crc kubenswrapper[4722]: E0219 19:59:29.072624 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:59:44 crc kubenswrapper[4722]: I0219 19:59:44.071718 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:59:44 crc kubenswrapper[4722]: E0219 19:59:44.072545 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 19:59:55 crc kubenswrapper[4722]: I0219 19:59:55.071733 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 19:59:55 crc kubenswrapper[4722]: E0219 19:59:55.072428 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.151746 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx"] Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.153848 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.159941 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.160112 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.179915 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx"] Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.200202 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvb55\" (UniqueName: \"kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.200309 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.200437 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.301630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.301795 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.301890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvb55\" (UniqueName: \"kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.302804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.308087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.326716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvb55\" (UniqueName: \"kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55\") pod \"collect-profiles-29525520-ks7mx\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.478446 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:00 crc kubenswrapper[4722]: I0219 20:00:00.982131 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx"] Feb 19 20:00:01 crc kubenswrapper[4722]: I0219 20:00:01.662377 4722 generic.go:334] "Generic (PLEG): container finished" podID="9a47544a-9ba7-49a2-b611-fe1965ebaf42" containerID="893890de942723d2a9f1e2aaf534f954497c2f692f3e27356bd25ee378bc213a" exitCode=0 Feb 19 20:00:01 crc kubenswrapper[4722]: I0219 20:00:01.662583 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" event={"ID":"9a47544a-9ba7-49a2-b611-fe1965ebaf42","Type":"ContainerDied","Data":"893890de942723d2a9f1e2aaf534f954497c2f692f3e27356bd25ee378bc213a"} Feb 19 20:00:01 crc kubenswrapper[4722]: I0219 20:00:01.662674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" 
event={"ID":"9a47544a-9ba7-49a2-b611-fe1965ebaf42","Type":"ContainerStarted","Data":"0468828ef75d8002ff15ef245f1dfd69975f85a3deb54b48830d39542a316e7a"} Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.111537 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.262857 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume\") pod \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.262912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume\") pod \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.262977 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvb55\" (UniqueName: \"kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55\") pod \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\" (UID: \"9a47544a-9ba7-49a2-b611-fe1965ebaf42\") " Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.264404 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume" (OuterVolumeSpecName: "config-volume") pod "9a47544a-9ba7-49a2-b611-fe1965ebaf42" (UID: "9a47544a-9ba7-49a2-b611-fe1965ebaf42"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.270908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9a47544a-9ba7-49a2-b611-fe1965ebaf42" (UID: "9a47544a-9ba7-49a2-b611-fe1965ebaf42"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.271085 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55" (OuterVolumeSpecName: "kube-api-access-rvb55") pod "9a47544a-9ba7-49a2-b611-fe1965ebaf42" (UID: "9a47544a-9ba7-49a2-b611-fe1965ebaf42"). InnerVolumeSpecName "kube-api-access-rvb55". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.366788 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9a47544a-9ba7-49a2-b611-fe1965ebaf42-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.366826 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvb55\" (UniqueName: \"kubernetes.io/projected/9a47544a-9ba7-49a2-b611-fe1965ebaf42-kube-api-access-rvb55\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.366838 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a47544a-9ba7-49a2-b611-fe1965ebaf42-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.686612 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" 
event={"ID":"9a47544a-9ba7-49a2-b611-fe1965ebaf42","Type":"ContainerDied","Data":"0468828ef75d8002ff15ef245f1dfd69975f85a3deb54b48830d39542a316e7a"} Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.686978 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0468828ef75d8002ff15ef245f1dfd69975f85a3deb54b48830d39542a316e7a" Feb 19 20:00:03 crc kubenswrapper[4722]: I0219 20:00:03.686732 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525520-ks7mx" Feb 19 20:00:04 crc kubenswrapper[4722]: I0219 20:00:04.198205 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"] Feb 19 20:00:04 crc kubenswrapper[4722]: I0219 20:00:04.211286 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525475-wskf7"] Feb 19 20:00:05 crc kubenswrapper[4722]: I0219 20:00:05.085296 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5e5981-45e4-4970-bff2-17a6087915e9" path="/var/lib/kubelet/pods/0d5e5981-45e4-4970-bff2-17a6087915e9/volumes" Feb 19 20:00:07 crc kubenswrapper[4722]: I0219 20:00:07.072165 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:00:07 crc kubenswrapper[4722]: E0219 20:00:07.073434 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:00:12 crc kubenswrapper[4722]: I0219 20:00:12.771677 4722 generic.go:334] "Generic (PLEG): 
container finished" podID="67f05b1f-f720-4b77-967c-2649fd05cb09" containerID="15f3bf7b4aaeb40fcc3e8b1c6a8270cdc8388a64a0038be83798c738a35d98e7" exitCode=0 Feb 19 20:00:12 crc kubenswrapper[4722]: I0219 20:00:12.771712 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" event={"ID":"67f05b1f-f720-4b77-967c-2649fd05cb09","Type":"ContainerDied","Data":"15f3bf7b4aaeb40fcc3e8b1c6a8270cdc8388a64a0038be83798c738a35d98e7"} Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.305573 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402450 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402678 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402818 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402913 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402932 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402960 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2sz7\" (UniqueName: \"kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: 
\"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.402976 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.403030 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1\") pod \"67f05b1f-f720-4b77-967c-2649fd05cb09\" (UID: \"67f05b1f-f720-4b77-967c-2649fd05cb09\") " Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.411545 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.437354 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7" (OuterVolumeSpecName: "kube-api-access-c2sz7") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "kube-api-access-c2sz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.437659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.439341 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.440210 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory" (OuterVolumeSpecName: "inventory") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.442715 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.444481 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.446265 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.448525 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.455742 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.463772 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "67f05b1f-f720-4b77-967c-2649fd05cb09" (UID: "67f05b1f-f720-4b77-967c-2649fd05cb09"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505266 4722 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505307 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505319 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505332 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505345 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2sz7\" (UniqueName: \"kubernetes.io/projected/67f05b1f-f720-4b77-967c-2649fd05cb09-kube-api-access-c2sz7\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505356 4722 
reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505369 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505380 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505391 4722 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505403 4722 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.505414 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/67f05b1f-f720-4b77-967c-2649fd05cb09-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.790614 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" event={"ID":"67f05b1f-f720-4b77-967c-2649fd05cb09","Type":"ContainerDied","Data":"3c98bf09b5168c37261366bfdf36c471ccfc735e44cc5f381dee883dd636c28f"} Feb 19 20:00:14 
crc kubenswrapper[4722]: I0219 20:00:14.791392 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c98bf09b5168c37261366bfdf36c471ccfc735e44cc5f381dee883dd636c28f" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.790704 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-cdks2" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.903022 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp"] Feb 19 20:00:14 crc kubenswrapper[4722]: E0219 20:00:14.903531 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a47544a-9ba7-49a2-b611-fe1965ebaf42" containerName="collect-profiles" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.903553 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a47544a-9ba7-49a2-b611-fe1965ebaf42" containerName="collect-profiles" Feb 19 20:00:14 crc kubenswrapper[4722]: E0219 20:00:14.903580 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67f05b1f-f720-4b77-967c-2649fd05cb09" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.903588 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="67f05b1f-f720-4b77-967c-2649fd05cb09" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.903833 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="67f05b1f-f720-4b77-967c-2649fd05cb09" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.904057 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a47544a-9ba7-49a2-b611-fe1965ebaf42" containerName="collect-profiles" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.904934 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.910975 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.911262 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.911445 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.911602 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.911711 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-jv7tz" Feb 19 20:00:14 crc kubenswrapper[4722]: I0219 20:00:14.918447 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp"] Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014289 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014709 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014786 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8jjk\" (UniqueName: \"kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.014966 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.015049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117206 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 
20:00:15.117322 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8jjk\" (UniqueName: \"kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117380 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.117509 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.130317 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.131728 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.132299 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.132817 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.137192 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.137737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.197922 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8jjk\" (UniqueName: \"kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.229471 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:00:15 crc kubenswrapper[4722]: W0219 20:00:15.822421 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a2c74da_6ac0_4070_9f5a_577bc5c64771.slice/crio-02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769 WatchSource:0}: Error finding container 02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769: Status 404 returned error can't find the container with id 02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769 Feb 19 20:00:15 crc kubenswrapper[4722]: I0219 20:00:15.831522 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp"] Feb 19 20:00:16 crc kubenswrapper[4722]: I0219 20:00:16.818856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" event={"ID":"4a2c74da-6ac0-4070-9f5a-577bc5c64771","Type":"ContainerStarted","Data":"10a106f9d091403365a306069c9198e890b02573d70dd878e404c55981b55a77"} Feb 19 20:00:16 crc kubenswrapper[4722]: I0219 20:00:16.819469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" event={"ID":"4a2c74da-6ac0-4070-9f5a-577bc5c64771","Type":"ContainerStarted","Data":"02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769"} Feb 19 20:00:16 crc kubenswrapper[4722]: I0219 20:00:16.845953 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" podStartSLOduration=2.433483937 podStartE2EDuration="2.845936556s" podCreationTimestamp="2026-02-19 20:00:14 +0000 UTC" firstStartedPulling="2026-02-19 20:00:15.824382008 +0000 UTC m=+2515.436732332" lastFinishedPulling="2026-02-19 20:00:16.236834627 +0000 UTC m=+2515.849184951" 
observedRunningTime="2026-02-19 20:00:16.84545164 +0000 UTC m=+2516.457801994" watchObservedRunningTime="2026-02-19 20:00:16.845936556 +0000 UTC m=+2516.458286880" Feb 19 20:00:22 crc kubenswrapper[4722]: I0219 20:00:22.071844 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:00:22 crc kubenswrapper[4722]: E0219 20:00:22.072677 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:00:37 crc kubenswrapper[4722]: I0219 20:00:37.071726 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:00:37 crc kubenswrapper[4722]: E0219 20:00:37.072630 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.295017 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.298449 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.331905 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.415003 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xxs\" (UniqueName: \"kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.415205 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.415242 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.517133 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xxs\" (UniqueName: \"kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.517299 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.517337 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.517777 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.517919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.542917 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xxs\" (UniqueName: \"kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs\") pod \"redhat-marketplace-hgj8p\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:43 crc kubenswrapper[4722]: I0219 20:00:43.632617 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:44 crc kubenswrapper[4722]: I0219 20:00:44.177869 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:45 crc kubenswrapper[4722]: I0219 20:00:45.132911 4722 generic.go:334] "Generic (PLEG): container finished" podID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerID="87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c" exitCode=0 Feb 19 20:00:45 crc kubenswrapper[4722]: I0219 20:00:45.132967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerDied","Data":"87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c"} Feb 19 20:00:45 crc kubenswrapper[4722]: I0219 20:00:45.133289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerStarted","Data":"182cc0081f8daaf558c0e18ef713d4c999f04aa48ebdb22e2a458e3216bb9ceb"} Feb 19 20:00:46 crc kubenswrapper[4722]: I0219 20:00:46.174953 4722 scope.go:117] "RemoveContainer" containerID="2e394652aecdb0cb849b3a87f5903a2cfceab4d4b8a685caa540a2bfe431a66b" Feb 19 20:00:47 crc kubenswrapper[4722]: I0219 20:00:47.154475 4722 generic.go:334] "Generic (PLEG): container finished" podID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerID="6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec" exitCode=0 Feb 19 20:00:47 crc kubenswrapper[4722]: I0219 20:00:47.154563 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerDied","Data":"6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec"} Feb 19 20:00:48 crc kubenswrapper[4722]: I0219 20:00:48.167240 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerStarted","Data":"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833"} Feb 19 20:00:48 crc kubenswrapper[4722]: I0219 20:00:48.194977 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hgj8p" podStartSLOduration=2.756886337 podStartE2EDuration="5.194953986s" podCreationTimestamp="2026-02-19 20:00:43 +0000 UTC" firstStartedPulling="2026-02-19 20:00:45.135232747 +0000 UTC m=+2544.747583071" lastFinishedPulling="2026-02-19 20:00:47.573300396 +0000 UTC m=+2547.185650720" observedRunningTime="2026-02-19 20:00:48.185682758 +0000 UTC m=+2547.798033092" watchObservedRunningTime="2026-02-19 20:00:48.194953986 +0000 UTC m=+2547.807304310" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.077287 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:00:51 crc kubenswrapper[4722]: E0219 20:00:51.078061 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.705648 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.708450 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.720385 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.786633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.786791 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jt5h\" (UniqueName: \"kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.786966 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.889146 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.889355 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2jt5h\" (UniqueName: \"kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.889438 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.889714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.889835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:51 crc kubenswrapper[4722]: I0219 20:00:51.914038 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jt5h\" (UniqueName: \"kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h\") pod \"certified-operators-w5nsl\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:52 crc kubenswrapper[4722]: I0219 20:00:52.045195 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:00:52 crc kubenswrapper[4722]: I0219 20:00:52.561572 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.218316 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerID="cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce" exitCode=0 Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.218366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerDied","Data":"cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce"} Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.218572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerStarted","Data":"c0e49dde2201530e923ca42ac66ad72cb85c66f2f04363fd7cf12ad95275cc23"} Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.633699 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.633991 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:53 crc kubenswrapper[4722]: I0219 20:00:53.691830 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:54 crc kubenswrapper[4722]: I0219 20:00:54.233763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" 
event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerStarted","Data":"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed"} Feb 19 20:00:54 crc kubenswrapper[4722]: I0219 20:00:54.284641 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.081375 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.255493 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerID="3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed" exitCode=0 Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.255573 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerDied","Data":"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed"} Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.255740 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hgj8p" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="registry-server" containerID="cri-o://8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833" gracePeriod=2 Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.871842 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.906711 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities\") pod \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.906784 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2xxs\" (UniqueName: \"kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs\") pod \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.906893 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content\") pod \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\" (UID: \"a5bc0218-2703-4ebb-86a4-8bbcffe69121\") " Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.908511 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities" (OuterVolumeSpecName: "utilities") pod "a5bc0218-2703-4ebb-86a4-8bbcffe69121" (UID: "a5bc0218-2703-4ebb-86a4-8bbcffe69121"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.929329 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5bc0218-2703-4ebb-86a4-8bbcffe69121" (UID: "a5bc0218-2703-4ebb-86a4-8bbcffe69121"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:00:56 crc kubenswrapper[4722]: I0219 20:00:56.938184 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs" (OuterVolumeSpecName: "kube-api-access-w2xxs") pod "a5bc0218-2703-4ebb-86a4-8bbcffe69121" (UID: "a5bc0218-2703-4ebb-86a4-8bbcffe69121"). InnerVolumeSpecName "kube-api-access-w2xxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.008609 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.008640 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5bc0218-2703-4ebb-86a4-8bbcffe69121-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.008649 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2xxs\" (UniqueName: \"kubernetes.io/projected/a5bc0218-2703-4ebb-86a4-8bbcffe69121-kube-api-access-w2xxs\") on node \"crc\" DevicePath \"\"" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.266926 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgj8p" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.266995 4722 generic.go:334] "Generic (PLEG): container finished" podID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerID="8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833" exitCode=0 Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.267056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerDied","Data":"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833"} Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.267102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgj8p" event={"ID":"a5bc0218-2703-4ebb-86a4-8bbcffe69121","Type":"ContainerDied","Data":"182cc0081f8daaf558c0e18ef713d4c999f04aa48ebdb22e2a458e3216bb9ceb"} Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.267125 4722 scope.go:117] "RemoveContainer" containerID="8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.270717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerStarted","Data":"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134"} Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.292078 4722 scope.go:117] "RemoveContainer" containerID="6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.307514 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.316835 4722 scope.go:117] "RemoveContainer" 
containerID="87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.324544 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgj8p"] Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.329232 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w5nsl" podStartSLOduration=2.818583787 podStartE2EDuration="6.329205229s" podCreationTimestamp="2026-02-19 20:00:51 +0000 UTC" firstStartedPulling="2026-02-19 20:00:53.220178829 +0000 UTC m=+2552.832529143" lastFinishedPulling="2026-02-19 20:00:56.730800251 +0000 UTC m=+2556.343150585" observedRunningTime="2026-02-19 20:00:57.307483223 +0000 UTC m=+2556.919833547" watchObservedRunningTime="2026-02-19 20:00:57.329205229 +0000 UTC m=+2556.941555563" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.345032 4722 scope.go:117] "RemoveContainer" containerID="8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833" Feb 19 20:00:57 crc kubenswrapper[4722]: E0219 20:00:57.345735 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833\": container with ID starting with 8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833 not found: ID does not exist" containerID="8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.345848 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833"} err="failed to get container status \"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833\": rpc error: code = NotFound desc = could not find container \"8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833\": 
container with ID starting with 8a06fe018a5f220a7ceae330ee2bba0d8ea87d576b71af88c2848fc17186b833 not found: ID does not exist" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.345904 4722 scope.go:117] "RemoveContainer" containerID="6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec" Feb 19 20:00:57 crc kubenswrapper[4722]: E0219 20:00:57.346585 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec\": container with ID starting with 6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec not found: ID does not exist" containerID="6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.346636 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec"} err="failed to get container status \"6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec\": rpc error: code = NotFound desc = could not find container \"6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec\": container with ID starting with 6362c4a3a099cc1fd68f3e97898170d6050e7402517ba158d7cbfe91bd4b28ec not found: ID does not exist" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.346659 4722 scope.go:117] "RemoveContainer" containerID="87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c" Feb 19 20:00:57 crc kubenswrapper[4722]: E0219 20:00:57.347013 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c\": container with ID starting with 87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c not found: ID does not exist" 
containerID="87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c" Feb 19 20:00:57 crc kubenswrapper[4722]: I0219 20:00:57.347048 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c"} err="failed to get container status \"87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c\": rpc error: code = NotFound desc = could not find container \"87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c\": container with ID starting with 87a164e9b3fe8ee8dcc27b8689e5c491f19676ce31a97caa0b58788feed5f62c not found: ID does not exist" Feb 19 20:00:59 crc kubenswrapper[4722]: I0219 20:00:59.081655 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" path="/var/lib/kubelet/pods/a5bc0218-2703-4ebb-86a4-8bbcffe69121/volumes" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.167867 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29525521-9cbvt"] Feb 19 20:01:00 crc kubenswrapper[4722]: E0219 20:01:00.168534 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="registry-server" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.168558 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="registry-server" Feb 19 20:01:00 crc kubenswrapper[4722]: E0219 20:01:00.168600 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="extract-utilities" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.168611 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="extract-utilities" Feb 19 20:01:00 crc kubenswrapper[4722]: E0219 20:01:00.168641 4722 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="extract-content" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.168653 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="extract-content" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.168995 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bc0218-2703-4ebb-86a4-8bbcffe69121" containerName="registry-server" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.174596 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.191911 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525521-9cbvt"] Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.294506 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.294619 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqw5z\" (UniqueName: \"kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.294658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data\") pod \"keystone-cron-29525521-9cbvt\" (UID: 
\"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.294718 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.397327 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqw5z\" (UniqueName: \"kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.397573 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.397738 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.397966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys\") pod \"keystone-cron-29525521-9cbvt\" (UID: 
\"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.403776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.417134 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.417200 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.420893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqw5z\" (UniqueName: \"kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z\") pod \"keystone-cron-29525521-9cbvt\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.497228 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:00 crc kubenswrapper[4722]: I0219 20:01:00.941760 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29525521-9cbvt"] Feb 19 20:01:01 crc kubenswrapper[4722]: I0219 20:01:01.315921 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-9cbvt" event={"ID":"973609f7-b4ce-41f2-ad80-83b1b1593e2f","Type":"ContainerStarted","Data":"342f08efdd598eafc0f8ea31b816d469b95fae87bad643fc4584f90d98ad449d"} Feb 19 20:01:01 crc kubenswrapper[4722]: I0219 20:01:01.315970 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-9cbvt" event={"ID":"973609f7-b4ce-41f2-ad80-83b1b1593e2f","Type":"ContainerStarted","Data":"2ecac706e54aad4a3908fec9e0940aa1fccbea84739f7ef2dc757b6179a2f248"} Feb 19 20:01:01 crc kubenswrapper[4722]: I0219 20:01:01.336006 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29525521-9cbvt" podStartSLOduration=1.335988681 podStartE2EDuration="1.335988681s" podCreationTimestamp="2026-02-19 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 20:01:01.33403185 +0000 UTC m=+2560.946382174" watchObservedRunningTime="2026-02-19 20:01:01.335988681 +0000 UTC m=+2560.948338995" Feb 19 20:01:02 crc kubenswrapper[4722]: I0219 20:01:02.047332 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:02 crc kubenswrapper[4722]: I0219 20:01:02.048375 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:02 crc kubenswrapper[4722]: I0219 20:01:02.134125 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:02 crc kubenswrapper[4722]: I0219 20:01:02.379989 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:02 crc kubenswrapper[4722]: I0219 20:01:02.437598 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.070914 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:01:04 crc kubenswrapper[4722]: E0219 20:01:04.071454 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.350213 4722 generic.go:334] "Generic (PLEG): container finished" podID="973609f7-b4ce-41f2-ad80-83b1b1593e2f" containerID="342f08efdd598eafc0f8ea31b816d469b95fae87bad643fc4584f90d98ad449d" exitCode=0 Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.350285 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-9cbvt" event={"ID":"973609f7-b4ce-41f2-ad80-83b1b1593e2f","Type":"ContainerDied","Data":"342f08efdd598eafc0f8ea31b816d469b95fae87bad643fc4584f90d98ad449d"} Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.350426 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w5nsl" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="registry-server" 
containerID="cri-o://150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134" gracePeriod=2 Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.911641 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.989353 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities\") pod \"6f75f82e-3471-4ff7-92e0-d758f55f5394\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.989554 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jt5h\" (UniqueName: \"kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h\") pod \"6f75f82e-3471-4ff7-92e0-d758f55f5394\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.989691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content\") pod \"6f75f82e-3471-4ff7-92e0-d758f55f5394\" (UID: \"6f75f82e-3471-4ff7-92e0-d758f55f5394\") " Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.991043 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities" (OuterVolumeSpecName: "utilities") pod "6f75f82e-3471-4ff7-92e0-d758f55f5394" (UID: "6f75f82e-3471-4ff7-92e0-d758f55f5394"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:01:04 crc kubenswrapper[4722]: I0219 20:01:04.996431 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h" (OuterVolumeSpecName: "kube-api-access-2jt5h") pod "6f75f82e-3471-4ff7-92e0-d758f55f5394" (UID: "6f75f82e-3471-4ff7-92e0-d758f55f5394"). InnerVolumeSpecName "kube-api-access-2jt5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.041989 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f75f82e-3471-4ff7-92e0-d758f55f5394" (UID: "6f75f82e-3471-4ff7-92e0-d758f55f5394"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.092849 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.093872 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jt5h\" (UniqueName: \"kubernetes.io/projected/6f75f82e-3471-4ff7-92e0-d758f55f5394-kube-api-access-2jt5h\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.093948 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f75f82e-3471-4ff7-92e0-d758f55f5394-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.360945 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f75f82e-3471-4ff7-92e0-d758f55f5394" 
containerID="150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134" exitCode=0 Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.361021 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w5nsl" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.361032 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerDied","Data":"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134"} Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.361439 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w5nsl" event={"ID":"6f75f82e-3471-4ff7-92e0-d758f55f5394","Type":"ContainerDied","Data":"c0e49dde2201530e923ca42ac66ad72cb85c66f2f04363fd7cf12ad95275cc23"} Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.361475 4722 scope.go:117] "RemoveContainer" containerID="150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.391991 4722 scope.go:117] "RemoveContainer" containerID="3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.392029 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.402844 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w5nsl"] Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.410344 4722 scope.go:117] "RemoveContainer" containerID="cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.463563 4722 scope.go:117] "RemoveContainer" containerID="150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134" Feb 19 
20:01:05 crc kubenswrapper[4722]: E0219 20:01:05.466806 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134\": container with ID starting with 150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134 not found: ID does not exist" containerID="150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.466885 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134"} err="failed to get container status \"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134\": rpc error: code = NotFound desc = could not find container \"150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134\": container with ID starting with 150200c571c5eb2d7b3871b9106100a2d114186dbcf243d5546a9c9956b26134 not found: ID does not exist" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.466926 4722 scope.go:117] "RemoveContainer" containerID="3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed" Feb 19 20:01:05 crc kubenswrapper[4722]: E0219 20:01:05.467279 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed\": container with ID starting with 3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed not found: ID does not exist" containerID="3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.467323 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed"} err="failed to get container status 
\"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed\": rpc error: code = NotFound desc = could not find container \"3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed\": container with ID starting with 3e4b96deb06056978e3538dc06c596eb819baaef981b42269d7b94eabda388ed not found: ID does not exist" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.467349 4722 scope.go:117] "RemoveContainer" containerID="cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce" Feb 19 20:01:05 crc kubenswrapper[4722]: E0219 20:01:05.467626 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce\": container with ID starting with cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce not found: ID does not exist" containerID="cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.467671 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce"} err="failed to get container status \"cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce\": rpc error: code = NotFound desc = could not find container \"cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce\": container with ID starting with cf925d62703c2f3b64e27a818fac994c19b4eb464d54f7d3dad26c92d8c77fce not found: ID does not exist" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.799971 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.805269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data\") pod \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.805358 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqw5z\" (UniqueName: \"kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z\") pod \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.805387 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys\") pod \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.805447 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle\") pod \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\" (UID: \"973609f7-b4ce-41f2-ad80-83b1b1593e2f\") " Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.811972 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "973609f7-b4ce-41f2-ad80-83b1b1593e2f" (UID: "973609f7-b4ce-41f2-ad80-83b1b1593e2f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.812033 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z" (OuterVolumeSpecName: "kube-api-access-nqw5z") pod "973609f7-b4ce-41f2-ad80-83b1b1593e2f" (UID: "973609f7-b4ce-41f2-ad80-83b1b1593e2f"). InnerVolumeSpecName "kube-api-access-nqw5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.853456 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "973609f7-b4ce-41f2-ad80-83b1b1593e2f" (UID: "973609f7-b4ce-41f2-ad80-83b1b1593e2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.865289 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data" (OuterVolumeSpecName: "config-data") pod "973609f7-b4ce-41f2-ad80-83b1b1593e2f" (UID: "973609f7-b4ce-41f2-ad80-83b1b1593e2f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.907239 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.907270 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqw5z\" (UniqueName: \"kubernetes.io/projected/973609f7-b4ce-41f2-ad80-83b1b1593e2f-kube-api-access-nqw5z\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.907282 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:05 crc kubenswrapper[4722]: I0219 20:01:05.907291 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/973609f7-b4ce-41f2-ad80-83b1b1593e2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:06 crc kubenswrapper[4722]: I0219 20:01:06.385933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29525521-9cbvt" event={"ID":"973609f7-b4ce-41f2-ad80-83b1b1593e2f","Type":"ContainerDied","Data":"2ecac706e54aad4a3908fec9e0940aa1fccbea84739f7ef2dc757b6179a2f248"} Feb 19 20:01:06 crc kubenswrapper[4722]: I0219 20:01:06.385983 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ecac706e54aad4a3908fec9e0940aa1fccbea84739f7ef2dc757b6179a2f248" Feb 19 20:01:06 crc kubenswrapper[4722]: I0219 20:01:06.386089 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29525521-9cbvt" Feb 19 20:01:07 crc kubenswrapper[4722]: I0219 20:01:07.083425 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" path="/var/lib/kubelet/pods/6f75f82e-3471-4ff7-92e0-d758f55f5394/volumes" Feb 19 20:01:16 crc kubenswrapper[4722]: I0219 20:01:16.071777 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:01:16 crc kubenswrapper[4722]: E0219 20:01:16.072902 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:01:27 crc kubenswrapper[4722]: I0219 20:01:27.071763 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:01:27 crc kubenswrapper[4722]: E0219 20:01:27.072604 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.769387 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:28 crc kubenswrapper[4722]: E0219 20:01:28.771969 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="extract-utilities" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.771989 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="extract-utilities" Feb 19 20:01:28 crc kubenswrapper[4722]: E0219 20:01:28.772001 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="registry-server" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.772008 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="registry-server" Feb 19 20:01:28 crc kubenswrapper[4722]: E0219 20:01:28.772053 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="973609f7-b4ce-41f2-ad80-83b1b1593e2f" containerName="keystone-cron" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.772062 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="973609f7-b4ce-41f2-ad80-83b1b1593e2f" containerName="keystone-cron" Feb 19 20:01:28 crc kubenswrapper[4722]: E0219 20:01:28.772094 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="extract-content" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.772100 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="extract-content" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.772436 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f75f82e-3471-4ff7-92e0-d758f55f5394" containerName="registry-server" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.772472 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="973609f7-b4ce-41f2-ad80-83b1b1593e2f" containerName="keystone-cron" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.787290 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.802369 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.857621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.858028 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmhbc\" (UniqueName: \"kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.858230 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.959956 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmhbc\" (UniqueName: \"kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.960131 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.960213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.960630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.960731 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:28 crc kubenswrapper[4722]: I0219 20:01:28.993123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmhbc\" (UniqueName: \"kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc\") pod \"redhat-operators-n2rkv\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:29 crc kubenswrapper[4722]: I0219 20:01:29.114135 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:29 crc kubenswrapper[4722]: I0219 20:01:29.692934 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:30 crc kubenswrapper[4722]: I0219 20:01:30.651607 4722 generic.go:334] "Generic (PLEG): container finished" podID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerID="7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9" exitCode=0 Feb 19 20:01:30 crc kubenswrapper[4722]: I0219 20:01:30.651683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerDied","Data":"7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9"} Feb 19 20:01:30 crc kubenswrapper[4722]: I0219 20:01:30.651928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerStarted","Data":"0ae6aa45c815f48f8f0269466e13089e0f93eaf09184b6a3ba49dc588c51d376"} Feb 19 20:01:31 crc kubenswrapper[4722]: I0219 20:01:31.665470 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerStarted","Data":"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff"} Feb 19 20:01:35 crc kubenswrapper[4722]: I0219 20:01:35.707485 4722 generic.go:334] "Generic (PLEG): container finished" podID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerID="94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff" exitCode=0 Feb 19 20:01:35 crc kubenswrapper[4722]: I0219 20:01:35.707560 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" 
event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerDied","Data":"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff"} Feb 19 20:01:36 crc kubenswrapper[4722]: I0219 20:01:36.720868 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerStarted","Data":"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0"} Feb 19 20:01:36 crc kubenswrapper[4722]: I0219 20:01:36.747583 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n2rkv" podStartSLOduration=3.219179615 podStartE2EDuration="8.747564606s" podCreationTimestamp="2026-02-19 20:01:28 +0000 UTC" firstStartedPulling="2026-02-19 20:01:30.653422492 +0000 UTC m=+2590.265772816" lastFinishedPulling="2026-02-19 20:01:36.181807483 +0000 UTC m=+2595.794157807" observedRunningTime="2026-02-19 20:01:36.738461583 +0000 UTC m=+2596.350811907" watchObservedRunningTime="2026-02-19 20:01:36.747564606 +0000 UTC m=+2596.359914930" Feb 19 20:01:39 crc kubenswrapper[4722]: I0219 20:01:39.071842 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:01:39 crc kubenswrapper[4722]: E0219 20:01:39.072501 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:01:39 crc kubenswrapper[4722]: I0219 20:01:39.114472 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:39 crc 
kubenswrapper[4722]: I0219 20:01:39.114512 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:40 crc kubenswrapper[4722]: I0219 20:01:40.172901 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n2rkv" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="registry-server" probeResult="failure" output=< Feb 19 20:01:40 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 20:01:40 crc kubenswrapper[4722]: > Feb 19 20:01:49 crc kubenswrapper[4722]: I0219 20:01:49.194475 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:49 crc kubenswrapper[4722]: I0219 20:01:49.270078 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:49 crc kubenswrapper[4722]: I0219 20:01:49.444326 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:50 crc kubenswrapper[4722]: I0219 20:01:50.843209 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n2rkv" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="registry-server" containerID="cri-o://9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0" gracePeriod=2 Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.072319 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:01:51 crc kubenswrapper[4722]: E0219 20:01:51.072937 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.403538 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.523334 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content\") pod \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.523722 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities\") pod \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.523885 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmhbc\" (UniqueName: \"kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc\") pod \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\" (UID: \"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0\") " Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.525534 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities" (OuterVolumeSpecName: "utilities") pod "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" (UID: "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.530211 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc" (OuterVolumeSpecName: "kube-api-access-qmhbc") pod "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" (UID: "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0"). InnerVolumeSpecName "kube-api-access-qmhbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.626187 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.626222 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmhbc\" (UniqueName: \"kubernetes.io/projected/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-kube-api-access-qmhbc\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.655893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" (UID: "9239d54c-2bd6-4d15-8a5d-c40d2bf369b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.728928 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.856118 4722 generic.go:334] "Generic (PLEG): container finished" podID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerID="9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0" exitCode=0 Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.856218 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerDied","Data":"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0"} Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.856277 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n2rkv" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.856304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n2rkv" event={"ID":"9239d54c-2bd6-4d15-8a5d-c40d2bf369b0","Type":"ContainerDied","Data":"0ae6aa45c815f48f8f0269466e13089e0f93eaf09184b6a3ba49dc588c51d376"} Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.856328 4722 scope.go:117] "RemoveContainer" containerID="9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.880064 4722 scope.go:117] "RemoveContainer" containerID="94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.901697 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.914893 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n2rkv"] Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.920833 4722 scope.go:117] "RemoveContainer" containerID="7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.979131 4722 scope.go:117] "RemoveContainer" containerID="9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0" Feb 19 20:01:51 crc kubenswrapper[4722]: E0219 20:01:51.979533 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0\": container with ID starting with 9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0 not found: ID does not exist" containerID="9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.979560 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0"} err="failed to get container status \"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0\": rpc error: code = NotFound desc = could not find container \"9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0\": container with ID starting with 9541800bdfdb3d16ef86643bfe1871d3d11a23ba6571337b74a0d72934eb2fd0 not found: ID does not exist" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.979633 4722 scope.go:117] "RemoveContainer" containerID="94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff" Feb 19 20:01:51 crc kubenswrapper[4722]: E0219 20:01:51.980109 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff\": container with ID starting with 94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff not found: ID does not exist" containerID="94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.980130 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff"} err="failed to get container status \"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff\": rpc error: code = NotFound desc = could not find container \"94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff\": container with ID starting with 94dfacfc019bd3b56fe354999e0e9af63610d8c1e1c873f3b88bdbe3aff167ff not found: ID does not exist" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.980142 4722 scope.go:117] "RemoveContainer" containerID="7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9" Feb 19 20:01:51 crc kubenswrapper[4722]: E0219 
20:01:51.980381 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9\": container with ID starting with 7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9 not found: ID does not exist" containerID="7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9" Feb 19 20:01:51 crc kubenswrapper[4722]: I0219 20:01:51.980400 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9"} err="failed to get container status \"7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9\": rpc error: code = NotFound desc = could not find container \"7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9\": container with ID starting with 7927ce951ee8ff693428068b7510bcc7a226a3175af9288824ce5a95fa9a10d9 not found: ID does not exist" Feb 19 20:01:53 crc kubenswrapper[4722]: I0219 20:01:53.096257 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" path="/var/lib/kubelet/pods/9239d54c-2bd6-4d15-8a5d-c40d2bf369b0/volumes" Feb 19 20:02:05 crc kubenswrapper[4722]: I0219 20:02:05.072074 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:02:05 crc kubenswrapper[4722]: E0219 20:02:05.072886 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:02:17 crc kubenswrapper[4722]: I0219 20:02:17.071406 
4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:02:17 crc kubenswrapper[4722]: E0219 20:02:17.073531 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:02:23 crc kubenswrapper[4722]: I0219 20:02:23.141783 4722 generic.go:334] "Generic (PLEG): container finished" podID="4a2c74da-6ac0-4070-9f5a-577bc5c64771" containerID="10a106f9d091403365a306069c9198e890b02573d70dd878e404c55981b55a77" exitCode=0 Feb 19 20:02:23 crc kubenswrapper[4722]: I0219 20:02:23.141913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" event={"ID":"4a2c74da-6ac0-4070-9f5a-577bc5c64771","Type":"ContainerDied","Data":"10a106f9d091403365a306069c9198e890b02573d70dd878e404c55981b55a77"} Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.624274 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821405 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821473 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821580 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc 
kubenswrapper[4722]: I0219 20:02:24.821831 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.821940 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8jjk\" (UniqueName: \"kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk\") pod \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\" (UID: \"4a2c74da-6ac0-4070-9f5a-577bc5c64771\") " Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.827317 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.828617 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk" (OuterVolumeSpecName: "kube-api-access-q8jjk") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "kube-api-access-q8jjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.855933 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory" (OuterVolumeSpecName: "inventory") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.857911 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.858668 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.858840 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.860943 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "4a2c74da-6ac0-4070-9f5a-577bc5c64771" (UID: "4a2c74da-6ac0-4070-9f5a-577bc5c64771"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925356 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925407 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925422 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8jjk\" (UniqueName: \"kubernetes.io/projected/4a2c74da-6ac0-4070-9f5a-577bc5c64771-kube-api-access-q8jjk\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925437 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925449 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-telemetry-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925461 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:24 crc kubenswrapper[4722]: I0219 20:02:24.925474 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2c74da-6ac0-4070-9f5a-577bc5c64771-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 20:02:25 crc kubenswrapper[4722]: I0219 20:02:25.163304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" event={"ID":"4a2c74da-6ac0-4070-9f5a-577bc5c64771","Type":"ContainerDied","Data":"02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769"} Feb 19 20:02:25 crc kubenswrapper[4722]: I0219 20:02:25.163355 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02d6b2604c08a66cc120ea55d8fc798b991254031bbb966ca2783a3b00293769" Feb 19 20:02:25 crc kubenswrapper[4722]: I0219 20:02:25.163375 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp" Feb 19 20:02:31 crc kubenswrapper[4722]: I0219 20:02:31.077931 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:02:31 crc kubenswrapper[4722]: E0219 20:02:31.078760 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:02:45 crc kubenswrapper[4722]: I0219 20:02:45.071811 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:02:45 crc kubenswrapper[4722]: E0219 20:02:45.073362 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:02:57 crc kubenswrapper[4722]: I0219 20:02:57.072087 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:02:57 crc kubenswrapper[4722]: E0219 20:02:57.073619 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:03:09 crc kubenswrapper[4722]: I0219 20:03:09.072200 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:03:09 crc kubenswrapper[4722]: E0219 20:03:09.072904 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:03:24 crc kubenswrapper[4722]: I0219 20:03:24.071401 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:03:24 crc kubenswrapper[4722]: I0219 20:03:24.715098 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9"} Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.788217 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rtsd9/must-gather-h964l"] Feb 19 20:03:30 crc kubenswrapper[4722]: E0219 20:03:30.788983 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="extract-utilities" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.788995 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="extract-utilities" Feb 19 20:03:30 crc kubenswrapper[4722]: E0219 20:03:30.789012 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="registry-server" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.789019 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="registry-server" Feb 19 20:03:30 crc kubenswrapper[4722]: E0219 20:03:30.789033 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="extract-content" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.789039 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="extract-content" Feb 19 20:03:30 crc kubenswrapper[4722]: E0219 20:03:30.789067 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2c74da-6ac0-4070-9f5a-577bc5c64771" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.789074 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2c74da-6ac0-4070-9f5a-577bc5c64771" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.789258 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9239d54c-2bd6-4d15-8a5d-c40d2bf369b0" containerName="registry-server" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.789280 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2c74da-6ac0-4070-9f5a-577bc5c64771" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.790346 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.800288 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rtsd9"/"openshift-service-ca.crt" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.800557 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rtsd9"/"kube-root-ca.crt" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.812753 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rtsd9/must-gather-h964l"] Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.899490 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9n8t\" (UniqueName: \"kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:30 crc kubenswrapper[4722]: I0219 20:03:30.899604 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.001617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9n8t\" (UniqueName: \"kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.001985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.002470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.026768 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9n8t\" (UniqueName: \"kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t\") pod \"must-gather-h964l\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") " pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.111300 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/must-gather-h964l" Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.853546 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 20:03:31 crc kubenswrapper[4722]: I0219 20:03:31.856553 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rtsd9/must-gather-h964l"] Feb 19 20:03:32 crc kubenswrapper[4722]: I0219 20:03:32.810168 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/must-gather-h964l" event={"ID":"71becbc5-18f8-4f0b-ad6d-a12d9846ac73","Type":"ContainerStarted","Data":"d066611f93dcd72357fa176895ee015380e463febd20c32f8384d090af7a454a"} Feb 19 20:03:40 crc kubenswrapper[4722]: I0219 20:03:40.913428 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/must-gather-h964l" event={"ID":"71becbc5-18f8-4f0b-ad6d-a12d9846ac73","Type":"ContainerStarted","Data":"6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073"} Feb 19 20:03:40 crc kubenswrapper[4722]: I0219 20:03:40.913969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/must-gather-h964l" event={"ID":"71becbc5-18f8-4f0b-ad6d-a12d9846ac73","Type":"ContainerStarted","Data":"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103"} Feb 19 20:03:40 crc kubenswrapper[4722]: I0219 20:03:40.947410 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rtsd9/must-gather-h964l" podStartSLOduration=3.134192971 podStartE2EDuration="10.947386588s" podCreationTimestamp="2026-02-19 20:03:30 +0000 UTC" firstStartedPulling="2026-02-19 20:03:31.853500299 +0000 UTC m=+2711.465850623" lastFinishedPulling="2026-02-19 20:03:39.666693896 +0000 UTC m=+2719.279044240" observedRunningTime="2026-02-19 20:03:40.932532906 +0000 UTC m=+2720.544883270" watchObservedRunningTime="2026-02-19 20:03:40.947386588 +0000 UTC 
m=+2720.559736912" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.579421 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-tbt92"] Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.582686 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.584642 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rtsd9"/"default-dockercfg-k8bvg" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.660108 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2dsn\" (UniqueName: \"kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn\") pod \"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.660382 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host\") pod \"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.762666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2dsn\" (UniqueName: \"kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn\") pod \"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.762847 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host\") pod 
\"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.762948 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host\") pod \"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.790485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2dsn\" (UniqueName: \"kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn\") pod \"crc-debug-tbt92\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.904459 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:03:43 crc kubenswrapper[4722]: I0219 20:03:43.957842 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" event={"ID":"5c02eb8b-79a9-47f3-823d-6919493345f2","Type":"ContainerStarted","Data":"44c6450afcb051e28021497062327988071c4a584fd190fe0c5ef8dba43c97e3"} Feb 19 20:03:57 crc kubenswrapper[4722]: I0219 20:03:57.122057 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" event={"ID":"5c02eb8b-79a9-47f3-823d-6919493345f2","Type":"ContainerStarted","Data":"9ed1007c399fbeb98d10bd541b68ef0b058451859bc414aca1e659ef08879eef"} Feb 19 20:03:57 crc kubenswrapper[4722]: I0219 20:03:57.145246 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" podStartSLOduration=1.351131539 podStartE2EDuration="14.145216082s" podCreationTimestamp="2026-02-19 20:03:43 +0000 UTC" 
firstStartedPulling="2026-02-19 20:03:43.937434522 +0000 UTC m=+2723.549784846" lastFinishedPulling="2026-02-19 20:03:56.731519075 +0000 UTC m=+2736.343869389" observedRunningTime="2026-02-19 20:03:57.141068132 +0000 UTC m=+2736.753418456" watchObservedRunningTime="2026-02-19 20:03:57.145216082 +0000 UTC m=+2736.757566406" Feb 19 20:04:14 crc kubenswrapper[4722]: I0219 20:04:14.279616 4722 generic.go:334] "Generic (PLEG): container finished" podID="5c02eb8b-79a9-47f3-823d-6919493345f2" containerID="9ed1007c399fbeb98d10bd541b68ef0b058451859bc414aca1e659ef08879eef" exitCode=0 Feb 19 20:04:14 crc kubenswrapper[4722]: I0219 20:04:14.279702 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" event={"ID":"5c02eb8b-79a9-47f3-823d-6919493345f2","Type":"ContainerDied","Data":"9ed1007c399fbeb98d10bd541b68ef0b058451859bc414aca1e659ef08879eef"} Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.420079 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.495620 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-tbt92"] Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.511260 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-tbt92"] Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.530281 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2dsn\" (UniqueName: \"kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn\") pod \"5c02eb8b-79a9-47f3-823d-6919493345f2\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.530442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host\") pod \"5c02eb8b-79a9-47f3-823d-6919493345f2\" (UID: \"5c02eb8b-79a9-47f3-823d-6919493345f2\") " Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.530526 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host" (OuterVolumeSpecName: "host") pod "5c02eb8b-79a9-47f3-823d-6919493345f2" (UID: "5c02eb8b-79a9-47f3-823d-6919493345f2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.530986 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5c02eb8b-79a9-47f3-823d-6919493345f2-host\") on node \"crc\" DevicePath \"\"" Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.537377 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn" (OuterVolumeSpecName: "kube-api-access-z2dsn") pod "5c02eb8b-79a9-47f3-823d-6919493345f2" (UID: "5c02eb8b-79a9-47f3-823d-6919493345f2"). InnerVolumeSpecName "kube-api-access-z2dsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:04:15 crc kubenswrapper[4722]: I0219 20:04:15.632682 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2dsn\" (UniqueName: \"kubernetes.io/projected/5c02eb8b-79a9-47f3-823d-6919493345f2-kube-api-access-z2dsn\") on node \"crc\" DevicePath \"\"" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.299034 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c6450afcb051e28021497062327988071c4a584fd190fe0c5ef8dba43c97e3" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.299112 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-tbt92" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.689761 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-lxgjq"] Feb 19 20:04:16 crc kubenswrapper[4722]: E0219 20:04:16.690244 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c02eb8b-79a9-47f3-823d-6919493345f2" containerName="container-00" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.690260 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c02eb8b-79a9-47f3-823d-6919493345f2" containerName="container-00" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.690436 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c02eb8b-79a9-47f3-823d-6919493345f2" containerName="container-00" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.691208 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.693509 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rtsd9"/"default-dockercfg-k8bvg" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.755546 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.755885 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72qk\" (UniqueName: \"kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " 
pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.857444 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w72qk\" (UniqueName: \"kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.857622 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.857912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:16 crc kubenswrapper[4722]: I0219 20:04:16.879421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72qk\" (UniqueName: \"kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk\") pod \"crc-debug-lxgjq\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:17 crc kubenswrapper[4722]: I0219 20:04:17.013799 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:17 crc kubenswrapper[4722]: I0219 20:04:17.082974 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c02eb8b-79a9-47f3-823d-6919493345f2" path="/var/lib/kubelet/pods/5c02eb8b-79a9-47f3-823d-6919493345f2/volumes" Feb 19 20:04:17 crc kubenswrapper[4722]: I0219 20:04:17.309949 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" event={"ID":"ed8cf44d-7e13-4277-a920-9fb05b46572a","Type":"ContainerStarted","Data":"68b15aadf203a5a3ab8566cc4aa2464283e90597aa995f36daa3b5f112cf187c"} Feb 19 20:04:17 crc kubenswrapper[4722]: I0219 20:04:17.309990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" event={"ID":"ed8cf44d-7e13-4277-a920-9fb05b46572a","Type":"ContainerStarted","Data":"db660097628c3c55eab2e5d6d408b1ff91c08194bf5ea622ef5b44f7227d6b12"} Feb 19 20:04:17 crc kubenswrapper[4722]: I0219 20:04:17.328830 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" podStartSLOduration=1.328816265 podStartE2EDuration="1.328816265s" podCreationTimestamp="2026-02-19 20:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 20:04:17.328448714 +0000 UTC m=+2756.940799038" watchObservedRunningTime="2026-02-19 20:04:17.328816265 +0000 UTC m=+2756.941166589" Feb 19 20:04:18 crc kubenswrapper[4722]: I0219 20:04:18.320514 4722 generic.go:334] "Generic (PLEG): container finished" podID="ed8cf44d-7e13-4277-a920-9fb05b46572a" containerID="68b15aadf203a5a3ab8566cc4aa2464283e90597aa995f36daa3b5f112cf187c" exitCode=1 Feb 19 20:04:18 crc kubenswrapper[4722]: I0219 20:04:18.320540 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" 
event={"ID":"ed8cf44d-7e13-4277-a920-9fb05b46572a","Type":"ContainerDied","Data":"68b15aadf203a5a3ab8566cc4aa2464283e90597aa995f36daa3b5f112cf187c"} Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.447267 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.479716 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-lxgjq"] Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.492263 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rtsd9/crc-debug-lxgjq"] Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.518437 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w72qk\" (UniqueName: \"kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk\") pod \"ed8cf44d-7e13-4277-a920-9fb05b46572a\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.518538 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host\") pod \"ed8cf44d-7e13-4277-a920-9fb05b46572a\" (UID: \"ed8cf44d-7e13-4277-a920-9fb05b46572a\") " Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.518671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host" (OuterVolumeSpecName: "host") pod "ed8cf44d-7e13-4277-a920-9fb05b46572a" (UID: "ed8cf44d-7e13-4277-a920-9fb05b46572a"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.519327 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed8cf44d-7e13-4277-a920-9fb05b46572a-host\") on node \"crc\" DevicePath \"\"" Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.523836 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk" (OuterVolumeSpecName: "kube-api-access-w72qk") pod "ed8cf44d-7e13-4277-a920-9fb05b46572a" (UID: "ed8cf44d-7e13-4277-a920-9fb05b46572a"). InnerVolumeSpecName "kube-api-access-w72qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:04:19 crc kubenswrapper[4722]: I0219 20:04:19.622518 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w72qk\" (UniqueName: \"kubernetes.io/projected/ed8cf44d-7e13-4277-a920-9fb05b46572a-kube-api-access-w72qk\") on node \"crc\" DevicePath \"\"" Feb 19 20:04:20 crc kubenswrapper[4722]: I0219 20:04:20.338887 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db660097628c3c55eab2e5d6d408b1ff91c08194bf5ea622ef5b44f7227d6b12" Feb 19 20:04:20 crc kubenswrapper[4722]: I0219 20:04:20.339023 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rtsd9/crc-debug-lxgjq" Feb 19 20:04:21 crc kubenswrapper[4722]: I0219 20:04:21.087381 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8cf44d-7e13-4277-a920-9fb05b46572a" path="/var/lib/kubelet/pods/ed8cf44d-7e13-4277-a920-9fb05b46572a/volumes" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.006534 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_78e7f1b1-be76-4f05-bd63-ff87b440e173/init-config-reloader/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.214716 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_78e7f1b1-be76-4f05-bd63-ff87b440e173/config-reloader/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.239861 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_78e7f1b1-be76-4f05-bd63-ff87b440e173/init-config-reloader/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.266604 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_78e7f1b1-be76-4f05-bd63-ff87b440e173/alertmanager/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.574750 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-546c4d4684-6vk7j_a7701b23-dddb-4a45-8982-11ab69bc30b1/barbican-api/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.681496 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-546c4d4684-6vk7j_a7701b23-dddb-4a45-8982-11ab69bc30b1/barbican-api-log/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.718287 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-98b54b474-9tfhf_96ffdf9d-f932-419b-be31-9f38358d2db5/barbican-keystone-listener/0.log" Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 
20:05:16.799937 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-98b54b474-9tfhf_96ffdf9d-f932-419b-be31-9f38358d2db5/barbican-keystone-listener-log/0.log"
Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.977788 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6767bd5ccf-ggbrg_66f5042d-2b30-4ac4-8594-cfc0f9590460/barbican-worker/0.log"
Feb 19 20:05:16 crc kubenswrapper[4722]: I0219 20:05:16.992990 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6767bd5ccf-ggbrg_66f5042d-2b30-4ac4-8594-cfc0f9590460/barbican-worker-log/0.log"
Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.189104 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-sbfj4_7573aaf8-263a-4e50-84da-58cf311829a9/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.293529 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1e7133a0-5642-4b7b-a560-d215b7fd75cd/ceilometer-central-agent/0.log"
Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.337783 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1e7133a0-5642-4b7b-a560-d215b7fd75cd/ceilometer-notification-agent/0.log"
Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.412387 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1e7133a0-5642-4b7b-a560-d215b7fd75cd/proxy-httpd/0.log"
Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.460524 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1e7133a0-5642-4b7b-a560-d215b7fd75cd/sg-core/0.log"
Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.638397 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8c8e6512-8007-4e99-8589-8dccb1975e3f/cinder-api/0.log"
Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.650198 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8c8e6512-8007-4e99-8589-8dccb1975e3f/cinder-api-log/0.log"
Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.879848 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_afcc30d0-b94c-4bf7-8736-fb35bc461fa2/cinder-scheduler/0.log"
Feb 19 20:05:17 crc kubenswrapper[4722]: I0219 20:05:17.914499 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_afcc30d0-b94c-4bf7-8736-fb35bc461fa2/probe/0.log"
Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.019747 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_fc650d44-069f-41ed-b944-f1168dd5b25c/cloudkitty-api/0.log"
Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.078796 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_fc650d44-069f-41ed-b944-f1168dd5b25c/cloudkitty-api-log/0.log"
Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.178787 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_53bc8f19-43b1-4297-a3db-986381793b6e/loki-compactor/0.log"
Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.358500 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-2j29g_47cbe0b4-7d45-486b-9e9b-964db524e7ab/gateway/0.log"
Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.377988 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-llw6c_aba36975-65f4-4f71-a709-261d2b9255ea/loki-distributor/0.log"
Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.550254 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-qxjk2_fc37f35d-ac2f-40a0-90e1-40c3b80b1782/gateway/0.log"
Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.695753 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_15869f30-52a4-4db0-aca8-53c5b319f7a1/loki-index-gateway/0.log"
Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.788825 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_a3fc19f1-6f9f-4f35-a391-1f6743480bd3/loki-ingester/0.log"
Feb 19 20:05:18 crc kubenswrapper[4722]: I0219 20:05:18.915522 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-k6gcm_cad6276e-0607-49e0-8a90-a11e9b916991/loki-querier/0.log"
Feb 19 20:05:19 crc kubenswrapper[4722]: I0219 20:05:19.051742 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-m6cl8_9babbc99-4133-47c1-85e5-95039351727b/loki-query-frontend/0.log"
Feb 19 20:05:19 crc kubenswrapper[4722]: I0219 20:05:19.360783 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-7m66x_7a9a8806-dadf-4cd5-af24-fc35c7e52197/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:19 crc kubenswrapper[4722]: I0219 20:05:19.595011 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bzc5t_7cf0842e-58ac-4cd1-b26f-9fc131177aa9/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:19 crc kubenswrapper[4722]: I0219 20:05:19.976539 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-8zj5g_f6d970a0-c801-4472-a3b6-eccd8335d0a8/init/0.log"
Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.170322 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-8zj5g_f6d970a0-c801-4472-a3b6-eccd8335d0a8/init/0.log"
Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.274120 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-85f64749dc-8zj5g_f6d970a0-c801-4472-a3b6-eccd8335d0a8/dnsmasq-dns/0.log"
Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.454926 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-6xmg8_23a67d89-596c-44f0-b19d-dc5d1eb3021e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.577088 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_84bb340d-f999-45fc-8e1c-d813e2ad4319/glance-httpd/0.log"
Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.590101 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_84bb340d-f999-45fc-8e1c-d813e2ad4319/glance-log/0.log"
Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.693189 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_99490f57-22ed-4652-a112-bf45feb67aee/glance-httpd/0.log"
Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.853514 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_99490f57-22ed-4652-a112-bf45feb67aee/glance-log/0.log"
Feb 19 20:05:20 crc kubenswrapper[4722]: I0219 20:05:20.977749 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-jjfs6_0a95e206-d7b9-49a5-8efd-7cab72e48d9d/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.143878 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zmp82_fa0d4605-cd87-49b1-b17f-8c0e06590afd/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.412050 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7cb5f76f4-hx5jh_32b3c2bb-2288-4e2e-a9c6-d19cfe651181/keystone-api/0.log"
Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.413390 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29525521-9cbvt_973609f7-b4ce-41f2-ad80-83b1b1593e2f/keystone-cron/0.log"
Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.534284 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f8493c9f-328a-446d-8110-5879a7aedd2b/kube-state-metrics/0.log"
Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.690213 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-xbxk2_a0d75723-6d9a-4609-a294-f179d1e84710/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:21 crc kubenswrapper[4722]: I0219 20:05:21.977619 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8694c7b8f7-2td8g_a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b/neutron-api/0.log"
Feb 19 20:05:22 crc kubenswrapper[4722]: I0219 20:05:22.035243 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8694c7b8f7-2td8g_a6ef8b7c-2bdf-4c6c-a214-2ab66b9d801b/neutron-httpd/0.log"
Feb 19 20:05:22 crc kubenswrapper[4722]: I0219 20:05:22.217710 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7l9zf_ee896205-7724-47fe-9f87-f2efb9afa870/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:22 crc kubenswrapper[4722]: I0219 20:05:22.557358 4722 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_nova-api-0_5aaacc6a-6882-467d-b66f-0178ccd35955/nova-api-log/0.log"
Feb 19 20:05:22 crc kubenswrapper[4722]: I0219 20:05:22.698542 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_5aaacc6a-6882-467d-b66f-0178ccd35955/nova-api-api/0.log"
Feb 19 20:05:22 crc kubenswrapper[4722]: I0219 20:05:22.897113 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_69f96c80-f951-453b-9880-ecd0591dc1bf/nova-cell0-conductor-conductor/0.log"
Feb 19 20:05:23 crc kubenswrapper[4722]: I0219 20:05:23.024360 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f7880856-0db7-4bbf-9202-04f90868fc1d/nova-cell1-conductor-conductor/0.log"
Feb 19 20:05:23 crc kubenswrapper[4722]: I0219 20:05:23.292885 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_168eaa46-c907-452a-8537-3cea6b524360/nova-cell1-novncproxy-novncproxy/0.log"
Feb 19 20:05:23 crc kubenswrapper[4722]: I0219 20:05:23.554353 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-cdks2_67f05b1f-f720-4b77-967c-2649fd05cb09/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:23 crc kubenswrapper[4722]: I0219 20:05:23.921776 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5f2a647c-7a68-4e2c-aabf-b18973b20ad0/nova-metadata-log/0.log"
Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.274725 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6f1f5c9a-dacb-45b5-95bf-2e62a12a908b/nova-scheduler-scheduler/0.log"
Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.423953 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a07f9633-74f5-48e5-8467-d649fc49a2ff/mysql-bootstrap/0.log"
Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.568594 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a07f9633-74f5-48e5-8467-d649fc49a2ff/mysql-bootstrap/0.log"
Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.677831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a07f9633-74f5-48e5-8467-d649fc49a2ff/galera/0.log"
Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.698494 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5f2a647c-7a68-4e2c-aabf-b18973b20ad0/nova-metadata-metadata/0.log"
Feb 19 20:05:24 crc kubenswrapper[4722]: I0219 20:05:24.912383 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_53444e7f-4c1d-401b-9896-5ff9c4aab65a/mysql-bootstrap/0.log"
Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.183383 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_53444e7f-4c1d-401b-9896-5ff9c4aab65a/mysql-bootstrap/0.log"
Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.193988 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_53444e7f-4c1d-401b-9896-5ff9c4aab65a/galera/0.log"
Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.339826 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_af557f35-ca9e-4990-bdcb-9e44366dab68/openstackclient/0.log"
Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.463452 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-6tmmr_293cde43-7bcf-4638-a080-badb26c81138/ovn-controller/0.log"
Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.701817 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-282bs_9470e2b8-0f01-4735-8050-1bae363b3a02/openstack-network-exporter/0.log"
Feb 19 20:05:25 crc kubenswrapper[4722]: I0219 20:05:25.859844 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fwvrs_c8300e35-4c72-4398-9058-0aa76005d576/ovsdb-server-init/0.log"
Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.074823 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fwvrs_c8300e35-4c72-4398-9058-0aa76005d576/ovs-vswitchd/0.log"
Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.112681 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fwvrs_c8300e35-4c72-4398-9058-0aa76005d576/ovsdb-server/0.log"
Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.121459 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-fwvrs_c8300e35-4c72-4398-9058-0aa76005d576/ovsdb-server-init/0.log"
Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.365983 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-v5f89_9ff9829f-e8f9-4d78-9826-0385817cf2a4/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.542676 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6f8e6f58-f989-41f2-b8cb-c798405cfa33/ovn-northd/0.log"
Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.584533 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6f8e6f58-f989-41f2-b8cb-c798405cfa33/openstack-network-exporter/0.log"
Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.766730 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13228713-9349-4241-b1f7-67f9a2c705fa/openstack-network-exporter/0.log"
Feb 19 20:05:26 crc kubenswrapper[4722]: I0219 20:05:26.819424 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_13228713-9349-4241-b1f7-67f9a2c705fa/ovsdbserver-nb/0.log"
Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.184492 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_05a27e5a-189e-4d17-9823-d95ef7906a7b/openstack-network-exporter/0.log"
Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.296988 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_05a27e5a-189e-4d17-9823-d95ef7906a7b/ovsdbserver-sb/0.log"
Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.410619 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7cc7c8879d-tnbfs_41b669ab-d733-4941-b134-b9ad19b38143/placement-api/0.log"
Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.499905 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7cc7c8879d-tnbfs_41b669ab-d733-4941-b134-b9ad19b38143/placement-log/0.log"
Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.660557 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e3f1f109-9754-4525-b5e8-dbf86ba52f2b/init-config-reloader/0.log"
Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.833109 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e3f1f109-9754-4525-b5e8-dbf86ba52f2b/init-config-reloader/0.log"
Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.866786 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e3f1f109-9754-4525-b5e8-dbf86ba52f2b/config-reloader/0.log"
Feb 19 20:05:27 crc kubenswrapper[4722]: I0219 20:05:27.916089 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e3f1f109-9754-4525-b5e8-dbf86ba52f2b/prometheus/0.log"
Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.149059 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_e3f1f109-9754-4525-b5e8-dbf86ba52f2b/thanos-sidecar/0.log"
Feb 19 20:05:28 crc
kubenswrapper[4722]: I0219 20:05:28.181077 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9ac0e00c-0e1d-40fa-802d-8a77ac4c842b/setup-container/0.log"
Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.407796 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9ac0e00c-0e1d-40fa-802d-8a77ac4c842b/setup-container/0.log"
Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.449968 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_9ac0e00c-0e1d-40fa-802d-8a77ac4c842b/rabbitmq/0.log"
Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.646512 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9f14785b-2e99-4110-9523-78ec32490e71/setup-container/0.log"
Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.847927 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9f14785b-2e99-4110-9523-78ec32490e71/setup-container/0.log"
Feb 19 20:05:28 crc kubenswrapper[4722]: I0219 20:05:28.892765 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9f14785b-2e99-4110-9523-78ec32490e71/rabbitmq/0.log"
Feb 19 20:05:29 crc kubenswrapper[4722]: I0219 20:05:29.062042 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2j2d6_baff33d3-a587-4283-a861-38d88a47539e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:29 crc kubenswrapper[4722]: I0219 20:05:29.210242 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-2hf52_d2554051-f8a8-413e-b352-13ac8f88da63/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:29 crc kubenswrapper[4722]: I0219 20:05:29.425831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ffmwx_78d0d06a-2199-4c5c-99e9-5bf916d8f30e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:29 crc kubenswrapper[4722]: I0219 20:05:29.531098 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-r6c9k_44ab5cbe-e4cd-4036-8768-104fcf0d8963/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:29 crc kubenswrapper[4722]: I0219 20:05:29.766988 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-rr66z_812efe23-7ca7-49b9-bd76-194a82c603b3/ssh-known-hosts-edpm-deployment/0.log"
Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.014012 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b7b95d7bc-zqb9x_42a3f824-28fe-4734-8ada-a74ffb9930a8/proxy-server/0.log"
Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.028555 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-b7b95d7bc-zqb9x_42a3f824-28fe-4734-8ada-a74ffb9930a8/proxy-httpd/0.log"
Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.254668 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-q5fhk_c81edb08-7ac8-4cfc-abce-5895b8e7b59b/swift-ring-rebalance/0.log"
Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.386716 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/account-auditor/0.log"
Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.478599 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/account-reaper/0.log"
Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.548657 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/account-replicator/0.log"
Feb 19 20:05:30 crc kubenswrapper[4722]: I0219 20:05:30.704337 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/account-server/0.log"
Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.016237 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/container-auditor/0.log"
Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.128324 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/container-replicator/0.log"
Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.212843 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/container-server/0.log"
Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.305519 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/container-updater/0.log"
Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.493013 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/object-expirer/0.log"
Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.494463 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/object-auditor/0.log"
Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.628491 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/object-replicator/0.log"
Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.752644 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/object-server/0.log"
Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.831271 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/object-updater/0.log"
Feb 19 20:05:31 crc kubenswrapper[4722]: I0219 20:05:31.923787 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/rsync/0.log"
Feb 19 20:05:32 crc kubenswrapper[4722]: I0219 20:05:32.027065 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_98dc74a5-9538-49e4-9dd0-eb2735f18d41/swift-recon-cron/0.log"
Feb 19 20:05:32 crc kubenswrapper[4722]: I0219 20:05:32.361947 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dvsnp_4a2c74da-6ac0-4070-9f5a-577bc5c64771/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:32 crc kubenswrapper[4722]: I0219 20:05:32.441955 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z5x8v_b51489f6-90e0-4a0d-ae54-24eb1e6f5568/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Feb 19 20:05:34 crc kubenswrapper[4722]: I0219 20:05:34.477859 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_0e52a6ab-57a4-4fd1-bd50-1832e756fc7f/cloudkitty-proc/0.log"
Feb 19 20:05:39 crc kubenswrapper[4722]: I0219 20:05:39.456905 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_059950bd-4e60-42e6-a9c6-4e4ab0b039aa/memcached/0.log"
Feb 19 20:05:41 crc kubenswrapper[4722]: I0219 20:05:41.798642 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:05:41 crc kubenswrapper[4722]: I0219 20:05:41.799332 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:05:52 crc kubenswrapper[4722]: I0219 20:05:52.969401 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lv7l8"]
Feb 19 20:05:52 crc kubenswrapper[4722]: E0219 20:05:52.970389 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8cf44d-7e13-4277-a920-9fb05b46572a" containerName="container-00"
Feb 19 20:05:52 crc kubenswrapper[4722]: I0219 20:05:52.970405 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8cf44d-7e13-4277-a920-9fb05b46572a" containerName="container-00"
Feb 19 20:05:52 crc kubenswrapper[4722]: I0219 20:05:52.970628 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8cf44d-7e13-4277-a920-9fb05b46572a" containerName="container-00"
Feb 19 20:05:52 crc kubenswrapper[4722]: I0219 20:05:52.972078 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:52 crc kubenswrapper[4722]: I0219 20:05:52.978592 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lv7l8"]
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.115238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.115422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmmkw\" (UniqueName: \"kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.115475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.216857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.217025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.217264 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmmkw\" (UniqueName: \"kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.217467 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.217750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.238682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmmkw\" (UniqueName: \"kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw\") pod \"community-operators-lv7l8\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.326401 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:05:53 crc kubenswrapper[4722]: I0219 20:05:53.917936 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lv7l8"]
Feb 19 20:05:54 crc kubenswrapper[4722]: I0219 20:05:54.210810 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerStarted","Data":"59b24e675f4ce76df37b7ccff17c4da3caeb28c453bb9e4a50d0f073f009b3ef"}
Feb 19 20:05:54 crc kubenswrapper[4722]: I0219 20:05:54.211073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerStarted","Data":"00eb92e2f6c141e6b8c8c232efe8dff7c051cec833dd70ef1159394157c42a2e"}
Feb 19 20:05:55 crc kubenswrapper[4722]: I0219 20:05:55.220956 4722 generic.go:334] "Generic (PLEG): container finished" podID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerID="59b24e675f4ce76df37b7ccff17c4da3caeb28c453bb9e4a50d0f073f009b3ef" exitCode=0
Feb 19 20:05:55 crc kubenswrapper[4722]: I0219 20:05:55.221036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerDied","Data":"59b24e675f4ce76df37b7ccff17c4da3caeb28c453bb9e4a50d0f073f009b3ef"}
Feb 19 20:05:55 crc kubenswrapper[4722]: I0219 20:05:55.221391 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerStarted","Data":"ebac0f3d8383994f453ee7ba243e2511ad740c3f2f894226e7061136874f02fb"}
Feb 19 20:05:56 crc kubenswrapper[4722]: I0219 20:05:56.233805 4722 generic.go:334] "Generic (PLEG): container finished" podID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6"
containerID="ebac0f3d8383994f453ee7ba243e2511ad740c3f2f894226e7061136874f02fb" exitCode=0
Feb 19 20:05:56 crc kubenswrapper[4722]: I0219 20:05:56.234002 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerDied","Data":"ebac0f3d8383994f453ee7ba243e2511ad740c3f2f894226e7061136874f02fb"}
Feb 19 20:05:57 crc kubenswrapper[4722]: I0219 20:05:57.246274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerStarted","Data":"9819fdef3375ecdd85bdd03209654849826276cd60f934e8df22d1dced211a29"}
Feb 19 20:05:57 crc kubenswrapper[4722]: I0219 20:05:57.270848 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lv7l8" podStartSLOduration=2.621294528 podStartE2EDuration="5.27083348s" podCreationTimestamp="2026-02-19 20:05:52 +0000 UTC" firstStartedPulling="2026-02-19 20:05:54.21591752 +0000 UTC m=+2853.828267844" lastFinishedPulling="2026-02-19 20:05:56.865456472 +0000 UTC m=+2856.477806796" observedRunningTime="2026-02-19 20:05:57.2692181 +0000 UTC m=+2856.881568434" watchObservedRunningTime="2026-02-19 20:05:57.27083348 +0000 UTC m=+2856.883183794"
Feb 19 20:06:02 crc kubenswrapper[4722]: I0219 20:06:02.446733 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/util/0.log"
Feb 19 20:06:02 crc kubenswrapper[4722]: I0219 20:06:02.628194 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/util/0.log"
Feb 19 20:06:02 crc kubenswrapper[4722]: I0219 20:06:02.636346 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/pull/0.log"
Feb 19 20:06:02 crc kubenswrapper[4722]: I0219 20:06:02.857513 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/pull/0.log"
Feb 19 20:06:02 crc kubenswrapper[4722]: I0219 20:06:02.987989 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/pull/0.log"
Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.005678 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/util/0.log"
Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.027344 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_533192a6fb53e10c2ef2ce684c936d5ab0d492d89dde7cccee1730ba80pfng2_4260359d-1333-4ec5-9a57-16e2782fcf0f/extract/0.log"
Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.327922 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.328267 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.382769 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.580257 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-mc64t_edbe95e5-3a5d-4dec-9a94-509234857155/manager/0.log"
Feb 19 20:06:03 crc kubenswrapper[4722]: I0219 20:06:03.947303 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-hxv5g_baba09d1-2238-4ca1-98ee-f44938b68cd3/manager/0.log"
Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.175982 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-qrsw8_019f7edd-1d9b-4069-a2a1-36bbe6b0a567/manager/0.log"
Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.375832 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lv7l8"
Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.401909 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-hncxm_2c02c7e1-6f72-44be-a4fb-10ca1df420aa/manager/0.log"
Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.439606 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lv7l8"]
Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.966981 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-x7bwr_b64009a1-83ef-4d66-bc6b-80ccfc6f7727/manager/0.log"
Feb 19 20:06:04 crc kubenswrapper[4722]: I0219 20:06:04.972439 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-rnh9h_c36983b4-b7f9-4834-85e9-a5c3cb83eb2d/manager/0.log"
Feb 19 20:06:05 crc kubenswrapper[4722]: I0219 20:06:05.046969 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-q5kgj_421f6539-4fcb-4949-ba29-34997fc98490/manager/0.log"
Feb 19 20:06:05 crc kubenswrapper[4722]: I0219 20:06:05.309448 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-x6wk7_db329f91-74f2-4baa-ab5a-85ad999fc8ef/manager/0.log"
Feb 19 20:06:05 crc kubenswrapper[4722]: I0219 20:06:05.526747 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-7qkx4_766eebc1-05fc-4ca0-8c75-276632a6597e/manager/0.log"
Feb 19 20:06:05 crc kubenswrapper[4722]: I0219 20:06:05.835767 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-8cljg_b37b04c7-5374-49d3-97c0-5b5b27c4a220/manager/0.log"
Feb 19 20:06:06 crc kubenswrapper[4722]: I0219 20:06:06.024238 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-6t7g6_57783601-5230-49ef-8ac2-0ddf78bd4b3a/manager/0.log"
Feb 19 20:06:06 crc kubenswrapper[4722]: I0219 20:06:06.376529 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lv7l8" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="registry-server" containerID="cri-o://9819fdef3375ecdd85bdd03209654849826276cd60f934e8df22d1dced211a29" gracePeriod=2
Feb 19 20:06:06 crc kubenswrapper[4722]: I0219 20:06:06.609928 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-wqp5t_64ff9a64-f79f-4a45-943d-36152964cfcd/manager/0.log"
Feb 19 20:06:06 crc kubenswrapper[4722]: I0219 20:06:06.834537 4722 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9csntwh_8870a7b1-f894-4429-9f52-d9063fe9c780/manager/0.log" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.429814 4722 generic.go:334] "Generic (PLEG): container finished" podID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerID="9819fdef3375ecdd85bdd03209654849826276cd60f934e8df22d1dced211a29" exitCode=0 Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.430068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerDied","Data":"9819fdef3375ecdd85bdd03209654849826276cd60f934e8df22d1dced211a29"} Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.488063 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6ddf4746f6-l927q_fb86a4c4-379d-4dcd-86c5-5ee95092e6c0/operator/0.log" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.555947 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.641506 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content\") pod \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.641833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities\") pod \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.641955 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmmkw\" (UniqueName: \"kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw\") pod \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\" (UID: \"9af17da0-e01d-43d5-9a7d-d97b1ff552c6\") " Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.643408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities" (OuterVolumeSpecName: "utilities") pod "9af17da0-e01d-43d5-9a7d-d97b1ff552c6" (UID: "9af17da0-e01d-43d5-9a7d-d97b1ff552c6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.649830 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.650809 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw" (OuterVolumeSpecName: "kube-api-access-hmmkw") pod "9af17da0-e01d-43d5-9a7d-d97b1ff552c6" (UID: "9af17da0-e01d-43d5-9a7d-d97b1ff552c6"). InnerVolumeSpecName "kube-api-access-hmmkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.712651 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9af17da0-e01d-43d5-9a7d-d97b1ff552c6" (UID: "9af17da0-e01d-43d5-9a7d-d97b1ff552c6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.746350 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-knsfg_efd426b6-a53d-4127-ae59-e2f9aec632cc/registry-server/0.log" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.757354 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:06:07 crc kubenswrapper[4722]: I0219 20:06:07.757383 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmmkw\" (UniqueName: \"kubernetes.io/projected/9af17da0-e01d-43d5-9a7d-d97b1ff552c6-kube-api-access-hmmkw\") on node \"crc\" DevicePath \"\"" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.096416 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-6dlqc_738a1346-88e9-4c4e-b7ce-1878736e2493/manager/0.log" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.308250 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-mgzgq_820eede6-6396-4466-bf00-5d3b39d982d6/manager/0.log" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.442118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lv7l8" event={"ID":"9af17da0-e01d-43d5-9a7d-d97b1ff552c6","Type":"ContainerDied","Data":"00eb92e2f6c141e6b8c8c232efe8dff7c051cec833dd70ef1159394157c42a2e"} Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.442200 4722 scope.go:117] "RemoveContainer" containerID="9819fdef3375ecdd85bdd03209654849826276cd60f934e8df22d1dced211a29" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.442374 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lv7l8" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.485125 4722 scope.go:117] "RemoveContainer" containerID="ebac0f3d8383994f453ee7ba243e2511ad740c3f2f894226e7061136874f02fb" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.500110 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lv7l8"] Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.514513 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lv7l8"] Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.569615 4722 scope.go:117] "RemoveContainer" containerID="59b24e675f4ce76df37b7ccff17c4da3caeb28c453bb9e4a50d0f073f009b3ef" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.594870 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pjv7d_65b17979-6c94-40e6-ac54-41a61a726e87/operator/0.log" Feb 19 20:06:08 crc kubenswrapper[4722]: I0219 20:06:08.872587 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-wktqn_29ea9b1d-8bbf-4977-a9d6-95a7eb7ee9e8/manager/0.log" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.020753 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-zft4s_a6fb3554-24ea-4330-b2cb-1c91f105345d/manager/0.log" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.104943 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" path="/var/lib/kubelet/pods/9af17da0-e01d-43d5-9a7d-d97b1ff552c6/volumes" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.299295 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-dbdmf_2bfbdb4e-4cb7-4925-8d5d-a2596e6283ac/manager/0.log" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.707305 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-zdfxj_f3d4c4b3-e6ce-40ef-94d1-6e59efc9c6c0/manager/0.log" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.827350 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5484b6858b-7g48c_792a7a0a-a11e-42ce-a99b-e24127e7bbe8/manager/0.log" Feb 19 20:06:09 crc kubenswrapper[4722]: I0219 20:06:09.881676 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5f8cf67456-vwhlj_12f061e0-51af-4ab9-a8a7-26b2775651e1/manager/0.log" Feb 19 20:06:11 crc kubenswrapper[4722]: I0219 20:06:11.797865 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:06:11 crc kubenswrapper[4722]: I0219 20:06:11.798216 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:06:11 crc kubenswrapper[4722]: I0219 20:06:11.958721 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-k5c54_0af2e6ef-277d-4022-b42b-5639b589fef9/manager/0.log" Feb 19 20:06:31 crc kubenswrapper[4722]: I0219 20:06:31.178607 4722 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-r4jmd_41fade82-0d8d-41b2-805e-8a92ffa97cf3/control-plane-machine-set-operator/0.log" Feb 19 20:06:31 crc kubenswrapper[4722]: I0219 20:06:31.356522 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-glfz9_bf8b7b84-382a-410f-8dea-c4f485402a77/kube-rbac-proxy/0.log" Feb 19 20:06:31 crc kubenswrapper[4722]: I0219 20:06:31.376555 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-glfz9_bf8b7b84-382a-410f-8dea-c4f485402a77/machine-api-operator/0.log" Feb 19 20:06:41 crc kubenswrapper[4722]: I0219 20:06:41.798592 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 20:06:41 crc kubenswrapper[4722]: I0219 20:06:41.799083 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 20:06:41 crc kubenswrapper[4722]: I0219 20:06:41.799126 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" Feb 19 20:06:41 crc kubenswrapper[4722]: I0219 20:06:41.799954 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 20:06:41 crc kubenswrapper[4722]: I0219 20:06:41.800011 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9" gracePeriod=600 Feb 19 20:06:42 crc kubenswrapper[4722]: I0219 20:06:42.797244 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9" exitCode=0 Feb 19 20:06:42 crc kubenswrapper[4722]: I0219 20:06:42.797287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9"} Feb 19 20:06:42 crc kubenswrapper[4722]: I0219 20:06:42.797512 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"} Feb 19 20:06:42 crc kubenswrapper[4722]: I0219 20:06:42.797539 4722 scope.go:117] "RemoveContainer" containerID="43d6186a9dedc1492ffb61d99b72112eeaa3c820fef622184e0f3fe69c78f209" Feb 19 20:06:45 crc kubenswrapper[4722]: I0219 20:06:45.450590 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fz7bp_9545d522-f459-4b98-ac7f-d107189b7497/cert-manager-controller/0.log" Feb 19 20:06:45 crc kubenswrapper[4722]: I0219 20:06:45.644577 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-242s6_b1356eef-86bd-4fbf-beb6-a98cd8bc60b8/cert-manager-cainjector/0.log" Feb 19 20:06:45 crc kubenswrapper[4722]: I0219 20:06:45.755001 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-hzrck_e49e50d8-05f3-42f4-a03a-f3a750e1a134/cert-manager-webhook/0.log" Feb 19 20:07:00 crc kubenswrapper[4722]: I0219 20:07:00.749347 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-nlx9v_ed131fa7-525a-481d-83a9-4fef817dc7ce/nmstate-console-plugin/0.log" Feb 19 20:07:00 crc kubenswrapper[4722]: I0219 20:07:00.952219 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-tvslw_59139bb2-e1ae-4f74-96fe-6ea34d232cd9/nmstate-handler/0.log" Feb 19 20:07:01 crc kubenswrapper[4722]: I0219 20:07:01.048332 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-t5lsr_62ed738c-2401-4b21-b6a8-1bc2c1c009ae/kube-rbac-proxy/0.log" Feb 19 20:07:01 crc kubenswrapper[4722]: I0219 20:07:01.122010 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-t5lsr_62ed738c-2401-4b21-b6a8-1bc2c1c009ae/nmstate-metrics/0.log" Feb 19 20:07:01 crc kubenswrapper[4722]: I0219 20:07:01.199872 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-hclph_296e010f-202c-4c01-836e-be6c48607e5f/nmstate-operator/0.log" Feb 19 20:07:01 crc kubenswrapper[4722]: I0219 20:07:01.337260 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-9jmpv_f9185385-162a-40a7-9563-3c668080b9e9/nmstate-webhook/0.log" Feb 19 20:07:14 crc kubenswrapper[4722]: I0219 20:07:14.919674 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dddbf65fc-6c7df_9a86c9a2-6e06-48f9-b266-1a47a3bb4fda/kube-rbac-proxy/0.log" Feb 19 20:07:14 crc kubenswrapper[4722]: I0219 20:07:14.946815 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dddbf65fc-6c7df_9a86c9a2-6e06-48f9-b266-1a47a3bb4fda/manager/0.log" Feb 19 20:07:28 crc kubenswrapper[4722]: I0219 20:07:28.128366 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-v7lzn_572e9436-e389-4b1e-b86f-e13f14f8d3eb/prometheus-operator/0.log" Feb 19 20:07:28 crc kubenswrapper[4722]: I0219 20:07:28.322311 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_1577ee2f-abd8-4e61-9fd1-238960e8bdf6/prometheus-operator-admission-webhook/0.log" Feb 19 20:07:28 crc kubenswrapper[4722]: I0219 20:07:28.329484 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_cc8f56cb-a9d1-4b27-adca-40adf6902cc8/prometheus-operator-admission-webhook/0.log" Feb 19 20:07:28 crc kubenswrapper[4722]: I0219 20:07:28.503905 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-4qpbt_7f659845-54cc-4e5c-892c-a754900c1f39/perses-operator/0.log" Feb 19 20:07:28 crc kubenswrapper[4722]: I0219 20:07:28.520777 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8xtkk_68e6d18b-f149-46fb-ba46-8fb37d82712a/operator/0.log" Feb 19 20:07:42 crc kubenswrapper[4722]: I0219 20:07:42.704850 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-h9kn7_1a80711d-831e-42ab-a5f8-6272eba9c635/kube-rbac-proxy/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 
20:07:43.053815 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-h9kn7_1a80711d-831e-42ab-a5f8-6272eba9c635/controller/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.117856 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-frr-files/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.321510 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-reloader/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.401382 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-metrics/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.406100 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-reloader/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.415680 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-frr-files/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.668475 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-reloader/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.670322 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-frr-files/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.676852 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-metrics/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.733633 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-metrics/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.951165 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-frr-files/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.951256 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-reloader/0.log" Feb 19 20:07:43 crc kubenswrapper[4722]: I0219 20:07:43.985133 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/cp-metrics/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.005983 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/controller/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.141658 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/frr-metrics/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.227534 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/kube-rbac-proxy/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.335340 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/kube-rbac-proxy-frr/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.371571 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/reloader/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.547866 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-8nh6q_505e06e7-65a2-4444-8552-8b96253c87fc/frr-k8s-webhook-server/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.705409 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-84788dc4db-d5shx_f41ca32e-24fc-427a-a2bc-76e4d5abba0f/manager/0.log" Feb 19 20:07:44 crc kubenswrapper[4722]: I0219 20:07:44.818014 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78b8d96b76-5d9t2_02eda63c-5131-407e-bb2e-7ad0adf0e985/webhook-server/0.log" Feb 19 20:07:45 crc kubenswrapper[4722]: I0219 20:07:45.009985 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nnmrq_d1319426-40ee-40fc-86bf-64cca26d6860/kube-rbac-proxy/0.log" Feb 19 20:07:45 crc kubenswrapper[4722]: I0219 20:07:45.444249 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nnmrq_d1319426-40ee-40fc-86bf-64cca26d6860/speaker/0.log" Feb 19 20:07:45 crc kubenswrapper[4722]: I0219 20:07:45.536022 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-92pkj_4f25f2fe-8438-431d-9e9d-9efba0109efd/frr/0.log" Feb 19 20:07:57 crc kubenswrapper[4722]: I0219 20:07:57.974594 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.181932 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.204668 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/pull/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.216280 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/pull/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.358678 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.376783 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/extract/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.392394 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651bzbf2_4f50f1aa-154d-409a-826d-c6c4b3c75559/pull/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.538521 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.738181 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/util/0.log" Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.739452 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/pull/0.log" Feb 19 
20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.760968 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/pull/0.log"
Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.925286 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/util/0.log"
Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.952946 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/pull/0.log"
Feb 19 20:07:58 crc kubenswrapper[4722]: I0219 20:07:58.963671 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08b6d8m_ae31c080-c2a8-484e-9d6a-bd55ca4ae533/extract/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.097989 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/util/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.269241 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/pull/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.269412 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/pull/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.290086 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/util/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.469712 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/extract/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.489526 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/util/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.496873 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213khfbb_23190f3b-c7a4-4368-ab62-9d5cbd8ddf72/pull/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.639999 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-utilities/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.825372 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-content/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.835453 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-utilities/0.log"
Feb 19 20:07:59 crc kubenswrapper[4722]: I0219 20:07:59.864643 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-content/0.log"
Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.040967 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-utilities/0.log"
Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.040978 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/extract-content/0.log"
Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.248747 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-utilities/0.log"
Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.521134 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vwrjw_7a6ec43d-cefe-40ee-b41e-81dc96b88739/registry-server/0.log"
Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.542301 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-content/0.log"
Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.570627 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-utilities/0.log"
Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.615053 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-content/0.log"
Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.746555 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-utilities/0.log"
Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.772421 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/extract-content/0.log"
Feb 19 20:08:00 crc kubenswrapper[4722]: I0219 20:08:00.968066 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/util/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.216136 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/pull/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.277503 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/pull/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.300615 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/util/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.390621 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n2l4s_19cd1ff4-6442-47bc-8c68-679c1c19abce/registry-server/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.487476 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/util/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.505469 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/extract/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.532748 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323eca7rdq2_9e5779bd-c885-4bc1-8f8d-924b571e2851/pull/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.559610 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-lrwfz_6fb12d29-ac35-4e04-a25d-05b1b2545b81/marketplace-operator/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.723623 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-utilities/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.888104 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-utilities/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.888117 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-content/0.log"
Feb 19 20:08:01 crc kubenswrapper[4722]: I0219 20:08:01.895931 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-content/0.log"
Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.111344 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-content/0.log"
Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.111373 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/extract-utilities/0.log"
Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.141723 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-utilities/0.log"
Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.248691 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xhpzr_277ec436-8032-4711-8573-5b2eaab8f212/registry-server/0.log"
Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.359246 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-content/0.log"
Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.360358 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-utilities/0.log"
Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.379811 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-content/0.log"
Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.536416 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-utilities/0.log"
Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.563661 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/extract-content/0.log"
Feb 19 20:08:02 crc kubenswrapper[4722]: I0219 20:08:02.994678 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-tr77s_a704e2d3-bed1-47a6-a2d1-af2c3583e06c/registry-server/0.log"
Feb 19 20:08:15 crc kubenswrapper[4722]: I0219 20:08:15.051681 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-688bff5cf9-b7fbb_cc8f56cb-a9d1-4b27-adca-40adf6902cc8/prometheus-operator-admission-webhook/0.log"
Feb 19 20:08:15 crc kubenswrapper[4722]: I0219 20:08:15.051904 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-688bff5cf9-gkbgq_1577ee2f-abd8-4e61-9fd1-238960e8bdf6/prometheus-operator-admission-webhook/0.log"
Feb 19 20:08:15 crc kubenswrapper[4722]: I0219 20:08:15.079676 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-v7lzn_572e9436-e389-4b1e-b86f-e13f14f8d3eb/prometheus-operator/0.log"
Feb 19 20:08:15 crc kubenswrapper[4722]: I0219 20:08:15.299771 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8xtkk_68e6d18b-f149-46fb-ba46-8fb37d82712a/operator/0.log"
Feb 19 20:08:15 crc kubenswrapper[4722]: I0219 20:08:15.301523 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-4qpbt_7f659845-54cc-4e5c-892c-a754900c1f39/perses-operator/0.log"
Feb 19 20:08:27 crc kubenswrapper[4722]: I0219 20:08:27.897961 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dddbf65fc-6c7df_9a86c9a2-6e06-48f9-b266-1a47a3bb4fda/manager/0.log"
Feb 19 20:08:27 crc kubenswrapper[4722]: I0219 20:08:27.920246 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5dddbf65fc-6c7df_9a86c9a2-6e06-48f9-b266-1a47a3bb4fda/kube-rbac-proxy/0.log"
Feb 19 20:09:11 crc kubenswrapper[4722]: I0219 20:09:11.798774 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:09:11 crc kubenswrapper[4722]: I0219 20:09:11.800977 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:09:41 crc kubenswrapper[4722]: I0219 20:09:41.798557 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:09:41 crc kubenswrapper[4722]: I0219 20:09:41.799445 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:10:03 crc kubenswrapper[4722]: I0219 20:10:03.369851 4722 generic.go:334] "Generic (PLEG): container finished" podID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerID="cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103" exitCode=0
Feb 19 20:10:03 crc kubenswrapper[4722]: I0219 20:10:03.369961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rtsd9/must-gather-h964l" event={"ID":"71becbc5-18f8-4f0b-ad6d-a12d9846ac73","Type":"ContainerDied","Data":"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103"}
Feb 19 20:10:03 crc kubenswrapper[4722]: I0219 20:10:03.371608 4722 scope.go:117] "RemoveContainer" containerID="cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103"
Feb 19 20:10:03 crc kubenswrapper[4722]: I0219 20:10:03.470924 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rtsd9_must-gather-h964l_71becbc5-18f8-4f0b-ad6d-a12d9846ac73/gather/0.log"
Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.784645 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rtsd9/must-gather-h964l"]
Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.785456 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rtsd9/must-gather-h964l" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="copy" containerID="cri-o://6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073" gracePeriod=2
Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.799279 4722 patch_prober.go:28] interesting pod/machine-config-daemon-w8zrl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.799529 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.799579 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl"
Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.800447 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"} pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.800515 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" containerName="machine-config-daemon" containerID="cri-o://f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" gracePeriod=600
Feb 19 20:10:11 crc kubenswrapper[4722]: I0219 20:10:11.801251 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rtsd9/must-gather-h964l"]
Feb 19 20:10:11 crc kubenswrapper[4722]: E0219 20:10:11.973340 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.351736 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rtsd9_must-gather-h964l_71becbc5-18f8-4f0b-ad6d-a12d9846ac73/copy/0.log"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.352482 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/must-gather-h964l"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.452188 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9n8t\" (UniqueName: \"kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t\") pod \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") "
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.452251 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output\") pod \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\" (UID: \"71becbc5-18f8-4f0b-ad6d-a12d9846ac73\") "
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.459524 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t" (OuterVolumeSpecName: "kube-api-access-m9n8t") pod "71becbc5-18f8-4f0b-ad6d-a12d9846ac73" (UID: "71becbc5-18f8-4f0b-ad6d-a12d9846ac73"). InnerVolumeSpecName "kube-api-access-m9n8t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.474282 4722 generic.go:334] "Generic (PLEG): container finished" podID="b265ff4c-d096-4b39-8032-fe0b84354832" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" exitCode=0
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.474356 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerDied","Data":"f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"}
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.474407 4722 scope.go:117] "RemoveContainer" containerID="1f7580bf2264179cbb9df05d3f112cd2d55865b3181feb3fa34eefea35e9eac9"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.475647 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"
Feb 19 20:10:12 crc kubenswrapper[4722]: E0219 20:10:12.476448 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.478272 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rtsd9_must-gather-h964l_71becbc5-18f8-4f0b-ad6d-a12d9846ac73/copy/0.log"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.480692 4722 generic.go:334] "Generic (PLEG): container finished" podID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerID="6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073" exitCode=143
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.480777 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rtsd9/must-gather-h964l"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.554790 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9n8t\" (UniqueName: \"kubernetes.io/projected/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-kube-api-access-m9n8t\") on node \"crc\" DevicePath \"\""
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.555518 4722 scope.go:117] "RemoveContainer" containerID="6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.608948 4722 scope.go:117] "RemoveContainer" containerID="cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.656588 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "71becbc5-18f8-4f0b-ad6d-a12d9846ac73" (UID: "71becbc5-18f8-4f0b-ad6d-a12d9846ac73"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.658825 4722 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/71becbc5-18f8-4f0b-ad6d-a12d9846ac73-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.718146 4722 scope.go:117] "RemoveContainer" containerID="6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073"
Feb 19 20:10:12 crc kubenswrapper[4722]: E0219 20:10:12.718677 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073\": container with ID starting with 6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073 not found: ID does not exist" containerID="6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.718721 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073"} err="failed to get container status \"6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073\": rpc error: code = NotFound desc = could not find container \"6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073\": container with ID starting with 6aca0c51a3c8c9aa1637f9a93d86a877bdb63b3ddcf3e783ca8240f21381e073 not found: ID does not exist"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.718750 4722 scope.go:117] "RemoveContainer" containerID="cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103"
Feb 19 20:10:12 crc kubenswrapper[4722]: E0219 20:10:12.719309 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103\": container with ID starting with cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103 not found: ID does not exist" containerID="cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103"
Feb 19 20:10:12 crc kubenswrapper[4722]: I0219 20:10:12.719353 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103"} err="failed to get container status \"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103\": rpc error: code = NotFound desc = could not find container \"cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103\": container with ID starting with cadf57fa6e9896434da74063fb3aa4d50ab858b17d7113f47e2e5e290028d103 not found: ID does not exist"
Feb 19 20:10:13 crc kubenswrapper[4722]: I0219 20:10:13.082990 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" path="/var/lib/kubelet/pods/71becbc5-18f8-4f0b-ad6d-a12d9846ac73/volumes"
Feb 19 20:10:24 crc kubenswrapper[4722]: I0219 20:10:24.072082 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"
Feb 19 20:10:24 crc kubenswrapper[4722]: E0219 20:10:24.073568 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 20:10:32 crc kubenswrapper[4722]: I0219 20:10:32.786727 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-8xtkk container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 20:10:32 crc kubenswrapper[4722]: I0219 20:10:32.793292 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podUID="68e6d18b-f149-46fb-ba46-8fb37d82712a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 20:10:32 crc kubenswrapper[4722]: I0219 20:10:32.794767 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-8xtkk container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 20:10:32 crc kubenswrapper[4722]: I0219 20:10:32.794812 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-8xtkk" podUID="68e6d18b-f149-46fb-ba46-8fb37d82712a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.13:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 19 20:10:37 crc kubenswrapper[4722]: I0219 20:10:37.071505 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"
Feb 19 20:10:37 crc kubenswrapper[4722]: E0219 20:10:37.072476 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 20:10:46 crc kubenswrapper[4722]: I0219 20:10:46.526140 4722 scope.go:117] "RemoveContainer" containerID="68b15aadf203a5a3ab8566cc4aa2464283e90597aa995f36daa3b5f112cf187c"
Feb 19 20:10:46 crc kubenswrapper[4722]: I0219 20:10:46.565929 4722 scope.go:117] "RemoveContainer" containerID="9ed1007c399fbeb98d10bd541b68ef0b058451859bc414aca1e659ef08879eef"
Feb 19 20:10:48 crc kubenswrapper[4722]: I0219 20:10:48.071859 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"
Feb 19 20:10:48 crc kubenswrapper[4722]: E0219 20:10:48.072760 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 20:11:03 crc kubenswrapper[4722]: I0219 20:11:03.071531 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def"
Feb 19 20:11:03 crc kubenswrapper[4722]: E0219 20:11:03.072347 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.124304 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"]
Feb 19 20:11:05 crc kubenswrapper[4722]: E0219 20:11:05.124874 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="extract-content"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.124890 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="extract-content"
Feb 19 20:11:05 crc kubenswrapper[4722]: E0219 20:11:05.124914 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="extract-utilities"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.124923 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="extract-utilities"
Feb 19 20:11:05 crc kubenswrapper[4722]: E0219 20:11:05.124938 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="copy"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.124946 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="copy"
Feb 19 20:11:05 crc kubenswrapper[4722]: E0219 20:11:05.124976 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="registry-server"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.124984 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="registry-server"
Feb 19 20:11:05 crc kubenswrapper[4722]: E0219 20:11:05.124999 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="gather"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.125009 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="gather"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.125307 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="copy"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.125330 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af17da0-e01d-43d5-9a7d-d97b1ff552c6" containerName="registry-server"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.125346 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="71becbc5-18f8-4f0b-ad6d-a12d9846ac73" containerName="gather"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.127063 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.137878 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"]
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.245924 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq9zq\" (UniqueName: \"kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.245974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.246192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.347896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.348124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq9zq\" (UniqueName: \"kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.348167 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.348557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.348690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.371774 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq9zq\" (UniqueName: \"kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq\") pod \"certified-operators-mz2kj\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.466466 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mz2kj"
Feb 19 20:11:05 crc kubenswrapper[4722]: I0219 20:11:05.990642 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"]
Feb 19 20:11:05 crc kubenswrapper[4722]: W0219 20:11:05.997669 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8098abc5_9bf4_457d_8aff_7f23a653bb59.slice/crio-e0bba27c9490233bbeb48bc1cfc00bc1c6819cdd03f5c9f0b20b9f6291f8d783 WatchSource:0}: Error finding container e0bba27c9490233bbeb48bc1cfc00bc1c6819cdd03f5c9f0b20b9f6291f8d783: Status 404 returned error can't find the container with id e0bba27c9490233bbeb48bc1cfc00bc1c6819cdd03f5c9f0b20b9f6291f8d783
Feb 19 20:11:06 crc kubenswrapper[4722]: I0219 20:11:06.232674 4722 generic.go:334] "Generic (PLEG): container finished" podID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerID="e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759" exitCode=0
Feb 19 20:11:06 crc kubenswrapper[4722]: I0219 20:11:06.232727 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerDied","Data":"e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759"}
Feb 19 20:11:06 crc kubenswrapper[4722]: I0219 20:11:06.232757
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerStarted","Data":"e0bba27c9490233bbeb48bc1cfc00bc1c6819cdd03f5c9f0b20b9f6291f8d783"} Feb 19 20:11:06 crc kubenswrapper[4722]: I0219 20:11:06.235485 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.241962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerStarted","Data":"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224"} Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.908410 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.912639 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.924817 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.996893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.997041 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zswt\" (UniqueName: \"kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:07 crc kubenswrapper[4722]: I0219 20:11:07.997110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.099045 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.099969 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.100489 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zswt\" (UniqueName: \"kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.100280 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.099899 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.125582 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zswt\" (UniqueName: \"kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt\") pod \"redhat-marketplace-zbfjr\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.247377 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:08 crc kubenswrapper[4722]: W0219 20:11:08.786801 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bce7cdd_fc85_4b5a_a7b3_c2b073897ad0.slice/crio-e51b70afe9074bf31ba964105dc467019ea31bd731cc08abe4a2a39f50aea483 WatchSource:0}: Error finding container e51b70afe9074bf31ba964105dc467019ea31bd731cc08abe4a2a39f50aea483: Status 404 returned error can't find the container with id e51b70afe9074bf31ba964105dc467019ea31bd731cc08abe4a2a39f50aea483 Feb 19 20:11:08 crc kubenswrapper[4722]: I0219 20:11:08.789622 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:09 crc kubenswrapper[4722]: I0219 20:11:09.265969 4722 generic.go:334] "Generic (PLEG): container finished" podID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerID="6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224" exitCode=0 Feb 19 20:11:09 crc kubenswrapper[4722]: I0219 20:11:09.266064 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerDied","Data":"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224"} Feb 19 20:11:09 crc kubenswrapper[4722]: I0219 20:11:09.269432 4722 generic.go:334] "Generic (PLEG): container finished" podID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerID="913a735795c14766116399446b635d98dcb30549bf7fd872ee753ae569a27a4a" exitCode=0 Feb 19 20:11:09 crc kubenswrapper[4722]: I0219 20:11:09.269474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerDied","Data":"913a735795c14766116399446b635d98dcb30549bf7fd872ee753ae569a27a4a"} Feb 19 20:11:09 crc kubenswrapper[4722]: I0219 
20:11:09.269505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerStarted","Data":"e51b70afe9074bf31ba964105dc467019ea31bd731cc08abe4a2a39f50aea483"} Feb 19 20:11:10 crc kubenswrapper[4722]: I0219 20:11:10.282889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerStarted","Data":"d010224a2c7a68cae242b9cbda355c630c2cfd1497db87c8703e8a5b31e6a628"} Feb 19 20:11:10 crc kubenswrapper[4722]: I0219 20:11:10.287114 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerStarted","Data":"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828"} Feb 19 20:11:10 crc kubenswrapper[4722]: I0219 20:11:10.323055 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mz2kj" podStartSLOduration=1.853841251 podStartE2EDuration="5.323037393s" podCreationTimestamp="2026-02-19 20:11:05 +0000 UTC" firstStartedPulling="2026-02-19 20:11:06.235250093 +0000 UTC m=+3165.847600417" lastFinishedPulling="2026-02-19 20:11:09.704446225 +0000 UTC m=+3169.316796559" observedRunningTime="2026-02-19 20:11:10.319342047 +0000 UTC m=+3169.931692371" watchObservedRunningTime="2026-02-19 20:11:10.323037393 +0000 UTC m=+3169.935387717" Feb 19 20:11:11 crc kubenswrapper[4722]: I0219 20:11:11.298096 4722 generic.go:334] "Generic (PLEG): container finished" podID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerID="d010224a2c7a68cae242b9cbda355c630c2cfd1497db87c8703e8a5b31e6a628" exitCode=0 Feb 19 20:11:11 crc kubenswrapper[4722]: I0219 20:11:11.298169 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" 
event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerDied","Data":"d010224a2c7a68cae242b9cbda355c630c2cfd1497db87c8703e8a5b31e6a628"} Feb 19 20:11:12 crc kubenswrapper[4722]: I0219 20:11:12.309604 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerStarted","Data":"22330e829ac7a8b9fcc933746900e1030b09639690401e6fd77a72b6f6dad762"} Feb 19 20:11:12 crc kubenswrapper[4722]: I0219 20:11:12.353276 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zbfjr" podStartSLOduration=2.66676097 podStartE2EDuration="5.353241991s" podCreationTimestamp="2026-02-19 20:11:07 +0000 UTC" firstStartedPulling="2026-02-19 20:11:09.271880144 +0000 UTC m=+3168.884230468" lastFinishedPulling="2026-02-19 20:11:11.958361165 +0000 UTC m=+3171.570711489" observedRunningTime="2026-02-19 20:11:12.333366521 +0000 UTC m=+3171.945716845" watchObservedRunningTime="2026-02-19 20:11:12.353241991 +0000 UTC m=+3171.965592355" Feb 19 20:11:15 crc kubenswrapper[4722]: I0219 20:11:15.466777 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:15 crc kubenswrapper[4722]: I0219 20:11:15.467572 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:15 crc kubenswrapper[4722]: I0219 20:11:15.541659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:16 crc kubenswrapper[4722]: I0219 20:11:16.071845 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:11:16 crc kubenswrapper[4722]: E0219 20:11:16.072347 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:11:16 crc kubenswrapper[4722]: I0219 20:11:16.443480 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:17 crc kubenswrapper[4722]: I0219 20:11:17.498099 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"] Feb 19 20:11:18 crc kubenswrapper[4722]: I0219 20:11:18.247834 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:18 crc kubenswrapper[4722]: I0219 20:11:18.247886 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:18 crc kubenswrapper[4722]: I0219 20:11:18.309700 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:18 crc kubenswrapper[4722]: I0219 20:11:18.375548 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mz2kj" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="registry-server" containerID="cri-o://2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828" gracePeriod=2 Feb 19 20:11:18 crc kubenswrapper[4722]: I0219 20:11:18.445963 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.014274 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.031003 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities\") pod \"8098abc5-9bf4-457d-8aff-7f23a653bb59\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.031532 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq9zq\" (UniqueName: \"kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq\") pod \"8098abc5-9bf4-457d-8aff-7f23a653bb59\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.031754 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content\") pod \"8098abc5-9bf4-457d-8aff-7f23a653bb59\" (UID: \"8098abc5-9bf4-457d-8aff-7f23a653bb59\") " Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.031889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities" (OuterVolumeSpecName: "utilities") pod "8098abc5-9bf4-457d-8aff-7f23a653bb59" (UID: "8098abc5-9bf4-457d-8aff-7f23a653bb59"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.032694 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.043660 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq" (OuterVolumeSpecName: "kube-api-access-nq9zq") pod "8098abc5-9bf4-457d-8aff-7f23a653bb59" (UID: "8098abc5-9bf4-457d-8aff-7f23a653bb59"). InnerVolumeSpecName "kube-api-access-nq9zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.096710 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8098abc5-9bf4-457d-8aff-7f23a653bb59" (UID: "8098abc5-9bf4-457d-8aff-7f23a653bb59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.135334 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8098abc5-9bf4-457d-8aff-7f23a653bb59-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.135365 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq9zq\" (UniqueName: \"kubernetes.io/projected/8098abc5-9bf4-457d-8aff-7f23a653bb59-kube-api-access-nq9zq\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.389787 4722 generic.go:334] "Generic (PLEG): container finished" podID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerID="2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828" exitCode=0 Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.389837 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mz2kj" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.389900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerDied","Data":"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828"} Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.389969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mz2kj" event={"ID":"8098abc5-9bf4-457d-8aff-7f23a653bb59","Type":"ContainerDied","Data":"e0bba27c9490233bbeb48bc1cfc00bc1c6819cdd03f5c9f0b20b9f6291f8d783"} Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.389998 4722 scope.go:117] "RemoveContainer" containerID="2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.412024 4722 scope.go:117] "RemoveContainer" 
containerID="6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.438963 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"] Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.453981 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mz2kj"] Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.456611 4722 scope.go:117] "RemoveContainer" containerID="e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.506169 4722 scope.go:117] "RemoveContainer" containerID="2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828" Feb 19 20:11:19 crc kubenswrapper[4722]: E0219 20:11:19.506659 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828\": container with ID starting with 2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828 not found: ID does not exist" containerID="2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.506700 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828"} err="failed to get container status \"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828\": rpc error: code = NotFound desc = could not find container \"2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828\": container with ID starting with 2ffb98c12a6528253f167e8e36ae0cbe913b7858cc48232ccfa0d23c73f67828 not found: ID does not exist" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.506725 4722 scope.go:117] "RemoveContainer" 
containerID="6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224" Feb 19 20:11:19 crc kubenswrapper[4722]: E0219 20:11:19.507208 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224\": container with ID starting with 6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224 not found: ID does not exist" containerID="6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.507260 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224"} err="failed to get container status \"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224\": rpc error: code = NotFound desc = could not find container \"6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224\": container with ID starting with 6f8676b39b0bd218762241a3ca908aeb53269ca719192ca3d8e5c902d03df224 not found: ID does not exist" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.507295 4722 scope.go:117] "RemoveContainer" containerID="e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759" Feb 19 20:11:19 crc kubenswrapper[4722]: E0219 20:11:19.507623 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759\": container with ID starting with e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759 not found: ID does not exist" containerID="e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759" Feb 19 20:11:19 crc kubenswrapper[4722]: I0219 20:11:19.507653 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759"} err="failed to get container status \"e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759\": rpc error: code = NotFound desc = could not find container \"e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759\": container with ID starting with e420688b89728b19840afdf958810b3bf7988a57bf6f0dead4892207e6105759 not found: ID does not exist" Feb 19 20:11:21 crc kubenswrapper[4722]: I0219 20:11:21.102279 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" path="/var/lib/kubelet/pods/8098abc5-9bf4-457d-8aff-7f23a653bb59/volumes" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.112842 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.113582 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zbfjr" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="registry-server" containerID="cri-o://22330e829ac7a8b9fcc933746900e1030b09639690401e6fd77a72b6f6dad762" gracePeriod=2 Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.442643 4722 generic.go:334] "Generic (PLEG): container finished" podID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerID="22330e829ac7a8b9fcc933746900e1030b09639690401e6fd77a72b6f6dad762" exitCode=0 Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.442739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerDied","Data":"22330e829ac7a8b9fcc933746900e1030b09639690401e6fd77a72b6f6dad762"} Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.619726 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.735690 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zswt\" (UniqueName: \"kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt\") pod \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.736028 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content\") pod \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.736115 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities\") pod \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\" (UID: \"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0\") " Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.737550 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities" (OuterVolumeSpecName: "utilities") pod "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" (UID: "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.744242 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt" (OuterVolumeSpecName: "kube-api-access-6zswt") pod "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" (UID: "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0"). InnerVolumeSpecName "kube-api-access-6zswt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.763824 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" (UID: "0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.839405 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zswt\" (UniqueName: \"kubernetes.io/projected/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-kube-api-access-6zswt\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.839785 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:23 crc kubenswrapper[4722]: I0219 20:11:23.839798 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.458266 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zbfjr" event={"ID":"0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0","Type":"ContainerDied","Data":"e51b70afe9074bf31ba964105dc467019ea31bd731cc08abe4a2a39f50aea483"} Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.458343 4722 scope.go:117] "RemoveContainer" containerID="22330e829ac7a8b9fcc933746900e1030b09639690401e6fd77a72b6f6dad762" Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.458351 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zbfjr" Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.510302 4722 scope.go:117] "RemoveContainer" containerID="d010224a2c7a68cae242b9cbda355c630c2cfd1497db87c8703e8a5b31e6a628" Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.514731 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.525850 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zbfjr"] Feb 19 20:11:24 crc kubenswrapper[4722]: I0219 20:11:24.536983 4722 scope.go:117] "RemoveContainer" containerID="913a735795c14766116399446b635d98dcb30549bf7fd872ee753ae569a27a4a" Feb 19 20:11:25 crc kubenswrapper[4722]: I0219 20:11:25.087860 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" path="/var/lib/kubelet/pods/0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0/volumes" Feb 19 20:11:31 crc kubenswrapper[4722]: I0219 20:11:31.077550 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:11:31 crc kubenswrapper[4722]: E0219 20:11:31.078362 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:11:42 crc kubenswrapper[4722]: I0219 20:11:42.071991 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:11:42 crc kubenswrapper[4722]: E0219 20:11:42.072918 4722 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.189888 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191070 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191082 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191093 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="extract-content" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191099 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="extract-content" Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191110 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="extract-utilities" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191116 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="extract-utilities" Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191131 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="extract-utilities" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 
20:11:47.191137 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="extract-utilities" Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191185 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="extract-content" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191192 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="extract-content" Feb 19 20:11:47 crc kubenswrapper[4722]: E0219 20:11:47.191207 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191213 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191400 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8098abc5-9bf4-457d-8aff-7f23a653bb59" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.191414 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bce7cdd-fc85-4b5a-a7b3-c2b073897ad0" containerName="registry-server" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.192962 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.204312 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.338960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.339027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjpv\" (UniqueName: \"kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.339279 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.441326 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.441399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rcjpv\" (UniqueName: \"kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.441465 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.441921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.441967 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.467786 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcjpv\" (UniqueName: \"kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv\") pod \"redhat-operators-7h79q\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.523442 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:47 crc kubenswrapper[4722]: I0219 20:11:47.991916 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:11:48 crc kubenswrapper[4722]: I0219 20:11:48.732343 4722 generic.go:334] "Generic (PLEG): container finished" podID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerID="217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8" exitCode=0 Feb 19 20:11:48 crc kubenswrapper[4722]: I0219 20:11:48.732393 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerDied","Data":"217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8"} Feb 19 20:11:48 crc kubenswrapper[4722]: I0219 20:11:48.732608 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerStarted","Data":"56b15ecd82052d49803a12a739aa9fc81dc3ed1ffdb9822b4047322c5da5eefe"} Feb 19 20:11:49 crc kubenswrapper[4722]: I0219 20:11:49.745072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerStarted","Data":"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1"} Feb 19 20:11:53 crc kubenswrapper[4722]: I0219 20:11:53.790976 4722 generic.go:334] "Generic (PLEG): container finished" podID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerID="64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1" exitCode=0 Feb 19 20:11:53 crc kubenswrapper[4722]: I0219 20:11:53.791073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" 
event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerDied","Data":"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1"} Feb 19 20:11:54 crc kubenswrapper[4722]: I0219 20:11:54.072350 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:11:54 crc kubenswrapper[4722]: E0219 20:11:54.072900 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:11:54 crc kubenswrapper[4722]: I0219 20:11:54.808420 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerStarted","Data":"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a"} Feb 19 20:11:54 crc kubenswrapper[4722]: I0219 20:11:54.832495 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7h79q" podStartSLOduration=2.365717994 podStartE2EDuration="7.832476727s" podCreationTimestamp="2026-02-19 20:11:47 +0000 UTC" firstStartedPulling="2026-02-19 20:11:48.734230195 +0000 UTC m=+3208.346580559" lastFinishedPulling="2026-02-19 20:11:54.200988958 +0000 UTC m=+3213.813339292" observedRunningTime="2026-02-19 20:11:54.827380549 +0000 UTC m=+3214.439730933" watchObservedRunningTime="2026-02-19 20:11:54.832476727 +0000 UTC m=+3214.444827051" Feb 19 20:11:57 crc kubenswrapper[4722]: I0219 20:11:57.525070 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:57 crc 
kubenswrapper[4722]: I0219 20:11:57.525453 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:11:58 crc kubenswrapper[4722]: I0219 20:11:58.596251 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7h79q" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="registry-server" probeResult="failure" output=< Feb 19 20:11:58 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 19 20:11:58 crc kubenswrapper[4722]: > Feb 19 20:12:07 crc kubenswrapper[4722]: I0219 20:12:07.590562 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:12:07 crc kubenswrapper[4722]: I0219 20:12:07.652245 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:12:07 crc kubenswrapper[4722]: I0219 20:12:07.829986 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:12:08 crc kubenswrapper[4722]: I0219 20:12:08.071878 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:12:08 crc kubenswrapper[4722]: E0219 20:12:08.072215 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:12:08 crc kubenswrapper[4722]: I0219 20:12:08.955129 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7h79q" 
podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="registry-server" containerID="cri-o://1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a" gracePeriod=2 Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.527864 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.607129 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content\") pod \"9ab54680-8998-4ed7-aa56-8196f18629c5\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.607470 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcjpv\" (UniqueName: \"kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv\") pod \"9ab54680-8998-4ed7-aa56-8196f18629c5\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.607656 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities\") pod \"9ab54680-8998-4ed7-aa56-8196f18629c5\" (UID: \"9ab54680-8998-4ed7-aa56-8196f18629c5\") " Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.608851 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities" (OuterVolumeSpecName: "utilities") pod "9ab54680-8998-4ed7-aa56-8196f18629c5" (UID: "9ab54680-8998-4ed7-aa56-8196f18629c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.611639 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.620317 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv" (OuterVolumeSpecName: "kube-api-access-rcjpv") pod "9ab54680-8998-4ed7-aa56-8196f18629c5" (UID: "9ab54680-8998-4ed7-aa56-8196f18629c5"). InnerVolumeSpecName "kube-api-access-rcjpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.713594 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcjpv\" (UniqueName: \"kubernetes.io/projected/9ab54680-8998-4ed7-aa56-8196f18629c5-kube-api-access-rcjpv\") on node \"crc\" DevicePath \"\"" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.749832 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ab54680-8998-4ed7-aa56-8196f18629c5" (UID: "9ab54680-8998-4ed7-aa56-8196f18629c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.816206 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ab54680-8998-4ed7-aa56-8196f18629c5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.968795 4722 generic.go:334] "Generic (PLEG): container finished" podID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerID="1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a" exitCode=0 Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.968863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerDied","Data":"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a"} Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.969598 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h79q" event={"ID":"9ab54680-8998-4ed7-aa56-8196f18629c5","Type":"ContainerDied","Data":"56b15ecd82052d49803a12a739aa9fc81dc3ed1ffdb9822b4047322c5da5eefe"} Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.968869 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7h79q" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.969639 4722 scope.go:117] "RemoveContainer" containerID="1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a" Feb 19 20:12:09 crc kubenswrapper[4722]: I0219 20:12:09.989270 4722 scope.go:117] "RemoveContainer" containerID="64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.012569 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.025902 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7h79q"] Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.037875 4722 scope.go:117] "RemoveContainer" containerID="217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.061756 4722 scope.go:117] "RemoveContainer" containerID="1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a" Feb 19 20:12:10 crc kubenswrapper[4722]: E0219 20:12:10.062266 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a\": container with ID starting with 1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a not found: ID does not exist" containerID="1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.062310 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a"} err="failed to get container status \"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a\": rpc error: code = NotFound desc = could not find container 
\"1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a\": container with ID starting with 1e4bda67dd079d820a4f178075df1aa20cc45bcc237b28fd26696a3f95f0ab4a not found: ID does not exist" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.062335 4722 scope.go:117] "RemoveContainer" containerID="64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1" Feb 19 20:12:10 crc kubenswrapper[4722]: E0219 20:12:10.062653 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1\": container with ID starting with 64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1 not found: ID does not exist" containerID="64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.062702 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1"} err="failed to get container status \"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1\": rpc error: code = NotFound desc = could not find container \"64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1\": container with ID starting with 64bb61012b64cdaff95f1c697b5839c587e80679d016db9a8d10588101698fb1 not found: ID does not exist" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.062734 4722 scope.go:117] "RemoveContainer" containerID="217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8" Feb 19 20:12:10 crc kubenswrapper[4722]: E0219 20:12:10.063102 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8\": container with ID starting with 217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8 not found: ID does not exist" 
containerID="217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8" Feb 19 20:12:10 crc kubenswrapper[4722]: I0219 20:12:10.063179 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8"} err="failed to get container status \"217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8\": rpc error: code = NotFound desc = could not find container \"217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8\": container with ID starting with 217cacabc811f38611240f79d4456d8c8ea4b94a39e85bd114cfee9d93f837f8 not found: ID does not exist" Feb 19 20:12:11 crc kubenswrapper[4722]: I0219 20:12:11.083424 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" path="/var/lib/kubelet/pods/9ab54680-8998-4ed7-aa56-8196f18629c5/volumes" Feb 19 20:12:23 crc kubenswrapper[4722]: I0219 20:12:23.071636 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:12:23 crc kubenswrapper[4722]: E0219 20:12:23.073430 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:12:36 crc kubenswrapper[4722]: I0219 20:12:36.071877 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:12:36 crc kubenswrapper[4722]: E0219 20:12:36.072589 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:12:47 crc kubenswrapper[4722]: I0219 20:12:47.076344 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:12:47 crc kubenswrapper[4722]: E0219 20:12:47.077548 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:13:00 crc kubenswrapper[4722]: I0219 20:13:00.072414 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:13:00 crc kubenswrapper[4722]: E0219 20:13:00.074133 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:13:15 crc kubenswrapper[4722]: I0219 20:13:15.071623 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:13:15 crc kubenswrapper[4722]: E0219 20:13:15.072497 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:13:28 crc kubenswrapper[4722]: I0219 20:13:28.077294 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:13:28 crc kubenswrapper[4722]: E0219 20:13:28.079963 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:13:42 crc kubenswrapper[4722]: I0219 20:13:42.071991 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:13:42 crc kubenswrapper[4722]: E0219 20:13:42.072656 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:13:54 crc kubenswrapper[4722]: I0219 20:13:54.071445 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:13:54 crc kubenswrapper[4722]: E0219 20:13:54.072276 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:14:09 crc kubenswrapper[4722]: I0219 20:14:09.071346 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:14:09 crc kubenswrapper[4722]: E0219 20:14:09.072072 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:14:20 crc kubenswrapper[4722]: I0219 20:14:20.071440 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:14:20 crc kubenswrapper[4722]: E0219 20:14:20.072414 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:14:33 crc kubenswrapper[4722]: I0219 20:14:33.071397 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:14:33 crc kubenswrapper[4722]: E0219 20:14:33.072441 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:14:45 crc kubenswrapper[4722]: I0219 20:14:45.072144 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:14:45 crc kubenswrapper[4722]: E0219 20:14:45.073207 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:14:59 crc kubenswrapper[4722]: I0219 20:14:59.080785 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:14:59 crc kubenswrapper[4722]: E0219 20:14:59.081807 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.152356 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g"] Feb 19 20:15:00 crc kubenswrapper[4722]: E0219 20:15:00.153227 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="extract-utilities" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.153243 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="extract-utilities" Feb 19 20:15:00 crc kubenswrapper[4722]: E0219 20:15:00.153261 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="registry-server" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.153270 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="registry-server" Feb 19 20:15:00 crc kubenswrapper[4722]: E0219 20:15:00.153307 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="extract-content" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.153314 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="extract-content" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.153555 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab54680-8998-4ed7-aa56-8196f18629c5" containerName="registry-server" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.154518 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.156805 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.157111 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.168857 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g"] Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.314347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.314702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsxt6\" (UniqueName: \"kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.314976 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.417627 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.417810 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsxt6\" (UniqueName: \"kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.417978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.419202 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.425261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.438264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsxt6\" (UniqueName: \"kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6\") pod \"collect-profiles-29525535-p825g\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.504967 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:00 crc kubenswrapper[4722]: I0219 20:15:00.988109 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g"] Feb 19 20:15:01 crc kubenswrapper[4722]: I0219 20:15:01.893873 4722 generic.go:334] "Generic (PLEG): container finished" podID="ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" containerID="bd3698ac075c0a7914916910624265b1e5a98426ab24840ac159e552ca7e1514" exitCode=0 Feb 19 20:15:01 crc kubenswrapper[4722]: I0219 20:15:01.893967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" event={"ID":"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa","Type":"ContainerDied","Data":"bd3698ac075c0a7914916910624265b1e5a98426ab24840ac159e552ca7e1514"} Feb 19 20:15:01 crc kubenswrapper[4722]: I0219 20:15:01.894249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" 
event={"ID":"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa","Type":"ContainerStarted","Data":"3fb4aa9edf887cb0b6574567b0ca3a21b764ceb2a300394ec2dd282ee7979c71"} Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.393097 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.490787 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsxt6\" (UniqueName: \"kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6\") pod \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.490877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume\") pod \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.490931 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume\") pod \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\" (UID: \"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa\") " Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.492092 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume" (OuterVolumeSpecName: "config-volume") pod "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" (UID: "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.495840 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6" (OuterVolumeSpecName: "kube-api-access-dsxt6") pod "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" (UID: "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa"). InnerVolumeSpecName "kube-api-access-dsxt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.502364 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" (UID: "ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.593036 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsxt6\" (UniqueName: \"kubernetes.io/projected/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-kube-api-access-dsxt6\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.593280 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.593361 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.918785 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" 
event={"ID":"ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa","Type":"ContainerDied","Data":"3fb4aa9edf887cb0b6574567b0ca3a21b764ceb2a300394ec2dd282ee7979c71"} Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.918832 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29525535-p825g" Feb 19 20:15:03 crc kubenswrapper[4722]: I0219 20:15:03.918866 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb4aa9edf887cb0b6574567b0ca3a21b764ceb2a300394ec2dd282ee7979c71" Feb 19 20:15:04 crc kubenswrapper[4722]: I0219 20:15:04.498434 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"] Feb 19 20:15:04 crc kubenswrapper[4722]: I0219 20:15:04.506913 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29525490-f85qf"] Feb 19 20:15:05 crc kubenswrapper[4722]: I0219 20:15:05.092358 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6513190-cf4a-405f-a7ca-c35f37d63725" path="/var/lib/kubelet/pods/b6513190-cf4a-405f-a7ca-c35f37d63725/volumes" Feb 19 20:15:11 crc kubenswrapper[4722]: I0219 20:15:11.077960 4722 scope.go:117] "RemoveContainer" containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:15:11 crc kubenswrapper[4722]: E0219 20:15:11.078805 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-w8zrl_openshift-machine-config-operator(b265ff4c-d096-4b39-8032-fe0b84354832)\"" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" podUID="b265ff4c-d096-4b39-8032-fe0b84354832" Feb 19 20:15:26 crc kubenswrapper[4722]: I0219 20:15:26.072920 4722 scope.go:117] "RemoveContainer" 
containerID="f9cb9ae704011b7c1059d4a7914221b7a107a199dbdd7bc96cf15e0d20a22def" Feb 19 20:15:27 crc kubenswrapper[4722]: I0219 20:15:27.157209 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-w8zrl" event={"ID":"b265ff4c-d096-4b39-8032-fe0b84354832","Type":"ContainerStarted","Data":"046e022c7f6b949833e8ff39bdeb53a1dc51d358998927287ef6b0062550383b"} Feb 19 20:15:46 crc kubenswrapper[4722]: I0219 20:15:46.807713 4722 scope.go:117] "RemoveContainer" containerID="1a0fba6d0ff68b77b5d4af6abf07f7a3a985db19a68b0e1561f090e9701e0cbe" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.773507 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:20 crc kubenswrapper[4722]: E0219 20:16:20.774641 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" containerName="collect-profiles" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.774659 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" containerName="collect-profiles" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.774937 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7c1c9f-aacb-4b7f-8cd6-7bd33ce2bcaa" containerName="collect-profiles" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.777778 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.817966 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.886206 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.886375 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74zv\" (UniqueName: \"kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.886444 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.988555 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.988690 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.988777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c74zv\" (UniqueName: \"kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.989257 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:20 crc kubenswrapper[4722]: I0219 20:16:20.989389 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:21 crc kubenswrapper[4722]: I0219 20:16:21.010797 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74zv\" (UniqueName: \"kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv\") pod \"community-operators-d872l\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:21 crc kubenswrapper[4722]: I0219 20:16:21.138561 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:21 crc kubenswrapper[4722]: I0219 20:16:21.646551 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:21 crc kubenswrapper[4722]: W0219 20:16:21.649087 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b92f9e9_b0bf_4870_a012_bcc485ce62c7.slice/crio-5bf7df4b54500f93d46343e348d5d6aec743b6900bea276b1ed9317495a81f44 WatchSource:0}: Error finding container 5bf7df4b54500f93d46343e348d5d6aec743b6900bea276b1ed9317495a81f44: Status 404 returned error can't find the container with id 5bf7df4b54500f93d46343e348d5d6aec743b6900bea276b1ed9317495a81f44 Feb 19 20:16:21 crc kubenswrapper[4722]: I0219 20:16:21.724940 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerStarted","Data":"5bf7df4b54500f93d46343e348d5d6aec743b6900bea276b1ed9317495a81f44"} Feb 19 20:16:22 crc kubenswrapper[4722]: I0219 20:16:22.734500 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b92f9e9-b0bf-4870-a012-bcc485ce62c7" containerID="09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd" exitCode=0 Feb 19 20:16:22 crc kubenswrapper[4722]: I0219 20:16:22.734566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerDied","Data":"09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd"} Feb 19 20:16:22 crc kubenswrapper[4722]: I0219 20:16:22.737288 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 20:16:23 crc kubenswrapper[4722]: I0219 20:16:23.745267 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerStarted","Data":"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb"} Feb 19 20:16:24 crc kubenswrapper[4722]: I0219 20:16:24.757604 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b92f9e9-b0bf-4870-a012-bcc485ce62c7" containerID="49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb" exitCode=0 Feb 19 20:16:24 crc kubenswrapper[4722]: I0219 20:16:24.757650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerDied","Data":"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb"} Feb 19 20:16:25 crc kubenswrapper[4722]: I0219 20:16:25.770068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerStarted","Data":"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23"} Feb 19 20:16:25 crc kubenswrapper[4722]: I0219 20:16:25.800297 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d872l" podStartSLOduration=3.318932277 podStartE2EDuration="5.800278603s" podCreationTimestamp="2026-02-19 20:16:20 +0000 UTC" firstStartedPulling="2026-02-19 20:16:22.736998811 +0000 UTC m=+3482.349349135" lastFinishedPulling="2026-02-19 20:16:25.218345137 +0000 UTC m=+3484.830695461" observedRunningTime="2026-02-19 20:16:25.789808917 +0000 UTC m=+3485.402159241" watchObservedRunningTime="2026-02-19 20:16:25.800278603 +0000 UTC m=+3485.412628927" Feb 19 20:16:31 crc kubenswrapper[4722]: I0219 20:16:31.140081 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:31 crc kubenswrapper[4722]: I0219 20:16:31.140739 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:31 crc kubenswrapper[4722]: I0219 20:16:31.191447 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:31 crc kubenswrapper[4722]: I0219 20:16:31.933372 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:32 crc kubenswrapper[4722]: I0219 20:16:32.002411 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:33 crc kubenswrapper[4722]: I0219 20:16:33.872890 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d872l" podUID="8b92f9e9-b0bf-4870-a012-bcc485ce62c7" containerName="registry-server" containerID="cri-o://9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23" gracePeriod=2 Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.880178 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.882631 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b92f9e9-b0bf-4870-a012-bcc485ce62c7" containerID="9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23" exitCode=0 Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.882667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerDied","Data":"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23"} Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.882693 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d872l" event={"ID":"8b92f9e9-b0bf-4870-a012-bcc485ce62c7","Type":"ContainerDied","Data":"5bf7df4b54500f93d46343e348d5d6aec743b6900bea276b1ed9317495a81f44"} Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.882723 4722 scope.go:117] "RemoveContainer" containerID="9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.904089 4722 scope.go:117] "RemoveContainer" containerID="49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.933592 4722 scope.go:117] "RemoveContainer" containerID="09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.976837 4722 scope.go:117] "RemoveContainer" containerID="9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23" Feb 19 20:16:34 crc kubenswrapper[4722]: E0219 20:16:34.977297 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23\": container with ID starting with 
9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23 not found: ID does not exist" containerID="9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.977339 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23"} err="failed to get container status \"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23\": rpc error: code = NotFound desc = could not find container \"9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23\": container with ID starting with 9fc9bbc2c05b6c5b72b2ab7f024bd3c441e2e525957404c88c27925bf0db3e23 not found: ID does not exist" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.977364 4722 scope.go:117] "RemoveContainer" containerID="49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb" Feb 19 20:16:34 crc kubenswrapper[4722]: E0219 20:16:34.977599 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb\": container with ID starting with 49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb not found: ID does not exist" containerID="49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.977626 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb"} err="failed to get container status \"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb\": rpc error: code = NotFound desc = could not find container \"49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb\": container with ID starting with 49076dd4283471a85c005ee435840c39e47542104ddcba600dce000c53779bbb not found: ID does not 
exist" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.977643 4722 scope.go:117] "RemoveContainer" containerID="09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd" Feb 19 20:16:34 crc kubenswrapper[4722]: E0219 20:16:34.977884 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd\": container with ID starting with 09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd not found: ID does not exist" containerID="09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.977908 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd"} err="failed to get container status \"09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd\": rpc error: code = NotFound desc = could not find container \"09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd\": container with ID starting with 09974abf59c1b4a64174b904a14f9b2c0268fe4b2ab31160f5ed65079ad0decd not found: ID does not exist" Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.991480 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities\") pod \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.991763 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c74zv\" (UniqueName: \"kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv\") pod \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " Feb 19 20:16:34 crc kubenswrapper[4722]: 
I0219 20:16:34.991835 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content\") pod \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\" (UID: \"8b92f9e9-b0bf-4870-a012-bcc485ce62c7\") " Feb 19 20:16:34 crc kubenswrapper[4722]: I0219 20:16:34.992965 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities" (OuterVolumeSpecName: "utilities") pod "8b92f9e9-b0bf-4870-a012-bcc485ce62c7" (UID: "8b92f9e9-b0bf-4870-a012-bcc485ce62c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.004002 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv" (OuterVolumeSpecName: "kube-api-access-c74zv") pod "8b92f9e9-b0bf-4870-a012-bcc485ce62c7" (UID: "8b92f9e9-b0bf-4870-a012-bcc485ce62c7"). InnerVolumeSpecName "kube-api-access-c74zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.048066 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b92f9e9-b0bf-4870-a012-bcc485ce62c7" (UID: "8b92f9e9-b0bf-4870-a012-bcc485ce62c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.094030 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c74zv\" (UniqueName: \"kubernetes.io/projected/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-kube-api-access-c74zv\") on node \"crc\" DevicePath \"\"" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.094058 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.094068 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b92f9e9-b0bf-4870-a012-bcc485ce62c7-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.892391 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d872l" Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.917140 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:35 crc kubenswrapper[4722]: I0219 20:16:35.930321 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d872l"] Feb 19 20:16:37 crc kubenswrapper[4722]: I0219 20:16:37.097478 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b92f9e9-b0bf-4870-a012-bcc485ce62c7" path="/var/lib/kubelet/pods/8b92f9e9-b0bf-4870-a012-bcc485ce62c7/volumes"